Configure Python sub-packages

I'm trying to set up sub-packages for a Python project; please refer to the structure below. The main setup.py will call the setup.py in each sub-package.
my_project
├── my_sub_package1
│   ├── foo2.py
│   ├── foo.py
│   └── setup.py
├── my_sub_package2
│   ├── bar2.py
│   ├── bar.py
│   └── setup.py
└── setup.py  [main]
With this structure, if a user of another project only needs one sub-package, they can choose to install "my_sub_package1" alone instead of the whole package (which can become bulky over time as the number of sub-packages grows).
Does anyone know if this is the correct way of doing it? Thanks!
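For illustration, the kind of standalone setup.py each sub-package folder could carry so that it installs on its own; a minimal sketch with placeholder name and version:

from setuptools import setup

# Hypothetical setup.py for my_sub_package1/; values are placeholders.
setup(
    name="my_sub_package1",
    version="0.1.0",
    py_modules=["foo", "foo2"],  # foo.py and foo2.py sit next to this file
)

With a file like this in each sub-package directory, a user could run pip install ./my_sub_package1 to pull in just that piece.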

Related

How to structure an AWS Lambda with python in subfolders?

I'm struggling to figure out how to properly set up a Python Lambda in a subdirectory and have imports work correctly. For example, suppose you don't put your Python code in the root folder as recommended by AWS and instead put it in a src folder, with a lambda_handler.py as the main handler and packages/folders inside that (src/api, for example). I am using the new SAM Accelerate, and they acknowledge a bug where it doesn't ignore the .aws-sam folder, so it loops infinitely with the project in root; they therefore recommend a subfolder, but that apparently greatly complicates things with Python.
I think I figured out how to get it to read my own packages and modules in subfolders using __init__.py files, but I can't get my requirements.txt to install, so its dependencies don't show up in the local build or the cloud build. I have found quite a few Stack Overflow posts that are seemingly on the subject, but none of them work for me or give an example I can follow. The following is my structure:
/
├── .aws-sam
└── src
    ├── app_folder_1
    │   ├── __init__.py
    │   └── example1.py
    ├── lambda_handler.py
    ├── app_folder_2
    │   ├── __init__.py
    │   └── example2.py
    ├── requirements.txt
    └── template.yml
I import pymysql, as an example, and the dependencies in requirements.txt are never installed, so pymysql is never found. I feel like things shouldn't be this difficult. Can anyone assist?
UPDATE: I may have figured out the issue; this post gave me a clue: https://github.com/aws/serverless-application-model/issues/1927 It appears that sam local invoke has the same issue with custom templates (something I was utilizing), and that isn't very intuitive, even though they claim it is working as intended.
UPDATE 2: That definitely helped my progress, but it still wasn't working as intended.
I forgot to return to this, so I figured I would post here to assist others who may be struggling, as Python is really finicky.
The following is the structure of a functional project in Lambda:
├── README.md
├── buildspec.yml
├── lambda_handler.py
├── locales
│   ├── en
│   │   └── LC_MESSAGES
│   │       └── base.po
│   ├── es
│   │   └── LC_MESSAGES
│   │       └── base.po
│   └── fr
│       └── LC_MESSAGES
│           └── base.po
├── my_project_name
│   ├── __init__.py
│   ├── python_file_1.py
│   ├── python_file_2.py
│   ├── python_file_3.py
│   └── repositories
│       ├── __init__.py
│       ├── resolvers_1.py
│       └── resolvers_2.py
├── requirements-dev.txt
├── requirements.txt
├── template.yml
└── tests
    ├── __init__.py
    ├── tests_file_1.py
    ├── tests_file_2.py
    ├── tests_file_3.py
    └── tests_file_4.py
Note that the base.mo files are generated during the build in the same location as the base.po files, and that requirements-dev.txt contains dev-only libraries (such as pep8 for formatting) and is only used within my local conda environments [I set up a virtual environment for each project using miniconda]. The purpose of the resolvers in the nested repositories folder is to separate the controller-like Python files from the calls to the repositories that retrieve data.
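As an aside, a minimal sketch of how a locales/ tree like the one above is typically consumed with the standard-library gettext module; the "base" domain matches the base.po/base.mo filenames, and the language code is illustrative:

import gettext

# Load the compiled base.mo for one language; domain/localedir mirror the tree above.
translator = gettext.translation("base", localedir="locales", languages=["es"])
translator.install()  # makes _() available as a builtin
print(_("Hello"))     # prints the Spanish translation if one exists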

python alternative package setup using setuptools

I am having some trouble adding packages with my particular setup:
.
├── pkg_a
│   ├── pkg_a
│   │   ├── __init__.py
│   │   └── module_a.py
│   └── run_a.py
├── pkg_b
│   ├── pkg_b
│   │   ├── __init__.py
│   │   └── module_b.py
│   └── run_b.py
└── setup.py
My goal is to be able to import package modules without repeating the package name twice.
For example, in run_a.py I'd like to be able to write from pkg_a import module_a instead of from pkg_a.pkg_a import module_a.
I tried to follow Section 2.1 of the doc here, creating setup.py as follows:
from setuptools import setup

setup(
    name="test",
    packages=['pkg_a', 'pkg_b'],
    package_dir={'pkg_a': 'pkg_a/pkg_a', 'pkg_b': 'pkg_b/pkg_b'},
)
But this does not achieve the desired effect: I ran python setup.py develop and then python -c 'from pkg_a import module_a', and the import fails.
Is this particular setup achievable? And what am I messing up here? Thanks all!
package_dir modifications do not work with editable (aka develop) installations. The only package_dir modification that editable installations accept is the one that covers the so-called src-layout:
package_dir={'': 'src'},
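For comparison, a minimal sketch of the src-layout that develop mode does accept, assuming you are willing to restructure (the tree in the comment mirrors the question's modules):

# Restructured tree (illustrative):
# .
# ├── src
# │   ├── pkg_a
# │   │   ├── __init__.py
# │   │   └── module_a.py
# │   └── pkg_b
# │       ├── __init__.py
# │       └── module_b.py
# └── setup.py

from setuptools import setup, find_packages

setup(
    name="test",
    packages=find_packages(where="src"),
    package_dir={"": "src"},  # the one package_dir form editable installs honor
)

After python setup.py develop, python -c 'from pkg_a import module_a' then works as desired.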

Python package-namespace: common test/docs/setup.py or one per namespace - which is the better pattern?

(In the interest of transparency, this is a follow-up to a question asked here.)
I'm dealing with related files, for which a namespace package seems a good fit. I'm following the guide from the Python Packaging Authority, which places a setup.py in each namespace package:
mynamespace-subpackage-a/
    setup.py
    mynamespace/
        subpackage_a/
            __init__.py

mynamespace-subpackage-b/
    setup.py
    mynamespace/
        subpackage_b/
            __init__.py
            module_b.py
In my tests, I created a similar project. Apart from setup.py, I placed my unit tests, docs, and other stuff per namespace (I left out some of the directories for compactness). I used PyScaffold to generate the namespaces.
├── namespace-package-test.package1
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg1
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg1_cli.py
│   │       └── __init__.py
│   └── tests
├── namespace-package-test.package2
│   ├── AUTHORS.rst
However, I then noticed that PyScaffold's putup command has an option to create namespace packages.
(venv) steve@PRVL10SJACKSON:~/Temp$ putup --force my-package -p pkg1 --namespace namespace1
(venv) steve@PRVL10SJACKSON:~/Temp$ putup --force my-package -p pkg1 --namespace namespace2
This creates a folder structure like this:
├── AUTHORS.rst
├── CHANGELOG.rst
├── LICENSE.txt
├── README.rst
├── requirements.txt
├── setup.cfg
├── setup.py
├── src
│   ├── namespace1
│   │   ├── __init__.py
│   │   └── pkg1
│   │       ├── __init__.py
│   │       └── skeleton.py
│   └── namespace2
│       ├── __init__.py
│       └── pkg1
│           ├── __init__.py
│           └── skeleton.py
└── tests
    ├── conftest.py
    └── test_skeleton.py
So I'm conflicted; I trust the team at PyScaffold, but it goes against the example from the Packaging Authority.
Are both approaches valid?
Is there a reason to choose one approach over the other?
The idea behind the namespace option in PyScaffold is to share/reuse namespaces across projects (as opposed to having more than one namespace inside a single project), or in other words, to split a larger project into independently maintained/developed projects.
To the best of my understanding, a structure like the one you showed in the 4th code block will not work. Using putup --force twice with two different namespaces on the same root folder is not the intended/supported usage.
The approach of PyScaffold is the same as the Packaging Authority's; the only difference is that PyScaffold assumes you have only one package contained in a single project and git repository (PyScaffold also uses a src directory, for the reasons explained in Ionel's blog post).
The reason behind adopting one setup.py per namespace+package is that it is required for building separate distribution files (i.e. you need one setup.py per *.whl).
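For reference, a hedged sketch of the per-package setup.py that the PyPA layout above implies, using native (PEP 420) namespace packages; the version is a placeholder:

from setuptools import setup, find_namespace_packages

# Hypothetical setup.py for mynamespace-subpackage-a/. With native namespace
# packages, mynamespace/ has no __init__.py, so find_namespace_packages()
# is needed in place of the usual find_packages().
setup(
    name="mynamespace-subpackage-a",
    version="0.1.0",
    packages=find_namespace_packages(include=["mynamespace.*"]),
)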

What is the correct way to distribute "bin" and "tests" directories for a Python package?

I have created a Python package.
At the advice of several internet sources (including https://github.com/pypa/sampleproject), I have set up the directory structure like so:
root_dir
├── bin
│   └── do_stuff.py
├── MANIFEST.in
├── README.md
├── my_lib
│   ├── __init__.py
│   ├── __main__.py
│   └── my_lib.py
├── setup.cfg
├── setup.py
├── important_script.py
└── tests
    ├── __init__.py
    └── test_lib.py
I have included tests, bin, and important_script.py in the manifest, and set include_package_data to True in setup.py.
However, after running pip install root_dir, I see that it correctly installed my_lib, but bin and tests were placed directly into Lib/site-packages as if they were separate packages.
I can't find important_script.py at all, and I don't think it was installed.
How do I correctly include these files/directories in my installation?
EDIT
So, it turns out that the bin and tests directories being placed directly into site-packages was caused by something I was doing previously, though I can't work out what. At some point, build and dist directories were generated in my root_dir (by pip or setuptools, I assume?), and any changes I made to the project after that were not actually showing up in the installed package. After deleting those directories, I can no longer reproduce the issue.
The sample project distributes neither bin nor tests; it even explicitly excludes tests.
To include bin you should use scripts or entry_points (like the sample project does). Add this to the setup() call in your setup.py:
scripts=['bin/do_stuff.py'],
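Alternatively, a sketch of the entry_points route; the command name do-stuff and a main() function in my_lib/my_lib.py are illustrative assumptions:

from setuptools import setup

setup(
    # ... name, version, packages, etc. ...
    entry_points={
        "console_scripts": [
            # Installs a "do-stuff" command that runs my_lib.my_lib:main().
            "do-stuff = my_lib.my_lib:main",  # main() is assumed to exist
        ],
    },
)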
To include tests you should restructure your tree to include the directory tests under the package directory:
root_dir
├── bin
│   └── do_stuff.py
├── MANIFEST.in
├── README.md
├── my_lib
│   ├── __init__.py
│   ├── __main__.py
│   ├── my_lib.py
│   └── tests
│       ├── __init__.py
│       └── test_lib.py
├── setup.cfg
├── setup.py
└── important_script.py
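With that layout, the tests package rides along automatically if setup() collects packages with find_packages(); a minimal sketch (the name is a placeholder):

from setuptools import setup, find_packages

setup(
    name="my_lib",
    packages=find_packages(),  # picks up both my_lib and my_lib.tests
)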

No module named utils error when running a .py file

I'm trying to run a .py file through the command prompt using the command python filename.py. I've already set the environment variables for Python after installing it, so I don't get any error when I type python. The file I'm running imports a few modules, all of which exist in the same directory as the file, apart from web.py, which I can't seem to locate there. I assumed it was somewhere inside the Python package I downloaded, but I couldn't find it there either. Would I need to install an extension for Python for web.py to be imported successfully, or is there another way around this?
I've downloaded Python 3.4, I'm using Windows 7 as my operating system, and the exact error I receive when I try to run the file is
ImportError: No module named 'utils'
Can someone please explain or direct me to a page which shows in detail how to install extensions for python?
This specific error happens when the Python interpreter can't find a particular .py file; in your case, the file utils.py.
First you need to find which file is trying to import utils.py. Starting with your main file, look up all the files you are importing. (I am guessing the issue is coming from one of the non-library files, but I could be wrong.)
Once you have the "top level" import list, check each of those files to see what THEY are importing, and repeat the process for them. Eventually, you will find the .py file that is trying to import utils. There might be a directory specification forcing Python to look in the wrong place.
Finally, using Windows' file manager, perform a search for utils.py. As a temporary fix, you can copy it from its current location into your working directory. That will at least allow you to get your project up and running until you sort out the real cause.
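A quick diagnostic sketch that can shortcut the manual search; it shows where Python looks for modules and whether utils resolves at all:

import sys
import importlib.util

print(sys.path)  # the directories Python searches for top-level imports

spec = importlib.util.find_spec("utils")
print(spec)      # None if 'utils' cannot be found; otherwise shows where it lives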
This error occurs when files or folders are not in their expected locations.
I had a very similar error with a Python Flask app: it turned out that my manage.py and config.py files were inside the app folder with the other folders (they were supposed to be outside the app directory), and that caused the error in my situation.
Once I placed the files in their proper locations, boom, the error was gone.
So check your application framework and make sure things are located where they're supposed to be.
Good luck
I installed via apt (I use Debian Linux) and had this same error in one project. For me, the solution was to install via pip:
$ pip install utils
It should work for both Python 2 and Python 3.
In my case, I ran the tree command in my Pipenv environment; the layout should look like the one below. I hope this helps.
.
├── README.md
├── __init__.py
├── core.yaml
├── core_blueprints
│   ├── __init__.py
│   ├── ami_lookup.py
│   ├── chef_buckets.py
│   ├── custom_resources
│   │   ├── __init__.py
│   │   └── cfn_custom_classes.py
│   ├── cw_alarm.py
│   ├── roles.py
│   ├── security_groups.py
│   ├── shared_iam
│   │   ├── __init__.py
│   │   └── iam_policies.py
│   ├── sns_subscription.py
│   ├── sns_topic.py
│   ├── ssm_chefrun_documents.py
│   ├── tf_state.py
│   ├── utils            # <-- not the correct location
│   │   ├── __init__.py
│   │   ├── standalone_output.py
│   │   ├── version.py
│   │   └── version_check.py
│   ├── vpc.py
│   ├── vpn_eip.py
│   └── vpn_server.py
├── core_hooks
│   ├── __init__.py
│   ├── cookbook_archive.py
│   ├── core_lambda.py
│   ├── keypair.py
│   ├── s3.py
│   ├── s3_cache.py
│   └── ssm.py
├── platform_version.py
├── prd1-ca-central-1.env
├── setup.py
└── utils                # <-- the correct location
    ├── __init__.py
    ├── standalone_output.py
    ├── version.py
    └── version_check.py
