Structure Python application with client and server parts

I have an application that is currently in the following folder structure:
myapp/
    client/
    core/
    server/
    template_files/
It has a server-side and a client-side component, and when I deploy the code to a user I don't want to include the server-side code as well. Both the client and the server need the core code to run.
Reading about general Python project structure, I realise I should start by changing my structure to:
myapp/
    myapp/
        client/
        core/
        server/
        template_files/  (templates are only needed by the server)
    bin/
    setup.py
What's the best way to structure my directories and do code deployment?

You might want to have three separate packages instead of one. Even your second structure will result in a single package (a Python egg) that contains all the sub-modules, which means the server code will still be packaged with the client code.
What you want is to split them up into three individual packages (i.e. myapp.client, myapp.core and myapp.server). They will have separate directory structures, so in effect you would have something like:
myapp.client/
    myapp/
        client/
    setup.py
myapp.core/
    myapp/
        core/
    setup.py
myapp.server/
    myapp/
        server/
        template_files/
    setup.py
As they will all become proper Python packages, you can define dependencies in the setup.py of myapp.client and myapp.server to require myapp.core. If/when you deploy the packages to PyPI (or elsewhere), your users can simply run pip install myapp.client to get the client library installed onto their system with all dependencies fetched.
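As a minimal sketch (the version number and the use of setuptools namespace packages are my assumptions, not part of the original answer), myapp.client/setup.py might look like:

# myapp.client/setup.py -- a minimal sketch; version and namespace
# handling are illustrative assumptions.
from setuptools import setup, find_packages

setup(
    name='myapp.client',
    version='0.1.0',
    packages=find_packages(),
    # share the top-level 'myapp' namespace with myapp.core / myapp.server
    namespace_packages=['myapp'],
    install_requires=['myapp.core'],  # pulls in the shared core package
)

With pkg_resources-style namespace packages, each distribution's myapp/__init__.py would contain only the namespace declaration, __import__('pkg_resources').declare_namespace(__name__), so the three distributions can coexist under one myapp namespace.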
You don't necessarily need a bin/ directory anywhere. You can take advantage of the entry_points argument to the setup() function to let setuptools create the "binary" for you in an OS-agnostic manner. Just define the main function(s) inside your library and let setuptools create the executable for your users.
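For example (the script name and the module path are hypothetical), adding a console_scripts entry point to the same setup() call makes setuptools generate a myapp-client command at install time:

from setuptools import setup

setup(
    name='myapp.client',
    version='0.1.0',
    # ...packages/install_requires as in the sketch above...
    entry_points={
        'console_scripts': [
            # creates a 'myapp-client' executable that calls the
            # (hypothetical) main() in myapp/client/cli.py
            'myapp-client = myapp.client.cli:main',
        ],
    },
)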
Lastly, you might want to take a look at what other open source projects have done to package their libraries; here are some examples:
Flask
Jinja2
plone.app.discussion

Related

Create python distribution for multiple subpackages

I want to create multiple sub-packages and distribute them individually. The obvious solution is to create a separate setup.py for each sub-package and a script that front-ends the creation and distribution of those packages. However, I am not sure if this is a good idea and if it's the right way to do it. Any recommendations? I looked into namespace packages, but that's not exactly what I need; my goal is to ship slim packages with minimal dependencies, and that's why I am breaking my project into multiple sub-packages that can be distributed individually.
Second question: currently I have the following structure:
a/b/c/
    __init__.py
    setup.py
    1.py
    2.py
    subpackage/
        __init__.py
        3.py
        setup.py
Now when I create an sdist using
python a/b/c/subpackage/setup.py sdist bdist_wheel
it adds all the files from the top-level directory too (e.g. 1.py and 2.py), but I would like to distribute only the files from the subpackage dir. How can I force it to add only files from the subpackage dir?
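One hedged sketch of a fix, assuming the goal is an sdist containing only subpackage: list just that package explicitly in subpackage/setup.py and run the build from inside that directory, so setuptools never scans the parent (name and version here are illustrative):

# a/b/c/subpackage/setup.py -- sketch; listing only this package keeps
# 1.py and 2.py (which live in the parent directory) out of the sdist.
from setuptools import setup

setup(
    name='subpackage',
    version='0.1.0',
    packages=['subpackage'],
    package_dir={'subpackage': '.'},  # this directory is the package root
)

Then build from within the subpackage directory:
cd a/b/c/subpackage && python setup.py sdist bdist_wheel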

Sharing local python scripts between packages

I've reviewed the information here: How can I make setuptools (or distribute) install a package from the local file system
It looks like that question was posted a very long time ago and I'm hoping that in the seven years since it was posted that there have been some new developments to managing dependencies in the Python world.
Specifically, in our case we are working on a repository of related GCP packages:
src/
    airflow/
        __init__.py
        dags/
            __init__.py
            requirements.txt
            dag1.py
        libs/
            __init__.py
            utils.py
        tests/
            dags/
                test_dag1.py
        plugins/
    dataflow/
        __init__.py
        setup.py
        dataflow1.py
        libs/
            __init__.py
            utils.py
    cli_helper/
        __init__.py
        cli_command.py
        libs/
            __init__.py
            util.py
    shared_utils/
        util1.py
I have found myself repeating the same bits of helper functions within each package, and would like to put those helper functions in one place and then have a linked copy of the shared_utils files, either in a shared_utils folder under each package, or even just a copy of util1.py placed under the existing libs directory of each package.
What is most "pythonic" way to accomplish this?
Right now it seems that my only options would be to:
1. Use requirements.txt as listed above where I can, and use a custom command in setup.py where requirements.txt can't be used.
2. Create an OS-level link from the shared_utils directory into each package, such that the directory appears to exist natively in each of my packages.
3. Package my shared_utils and then install it directly from git. Though this option again requires requirements.txt, and in some of my deployment environments I can't rely on requirements.txt; I have to run everything through setup.py.
If you can't use requirements.txt, maybe you can use setuptools' install_requires and point it at the shared_utils package root. Would that solve your issue?
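A minimal sketch of that idea (all names and versions are assumptions based on the layout above): give shared_utils its own setup.py, then declare it as a dependency from each package's setup.py.

# src/shared_utils/setup.py -- sketch; assumes util1.py sits inside an
# importable shared_utils package directory.
from setuptools import setup, find_packages

setup(
    name='shared_utils',
    version='0.1.0',
    packages=find_packages(),
)

And in, say, src/dataflow/setup.py:

from setuptools import setup, find_packages

setup(
    name='dataflow',
    version='0.1.0',
    packages=find_packages(),
    install_requires=['shared_utils'],  # resolved from an index or a prior local install
)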

How to structure a python project with three applications that use common module

My project contains three Python applications. Application 1 is a web app. Applications 2 and 3 contain scripts downloading some data.
All three apps need to use a module Common containing a "model" (classes that are saved to database) and common settings.
I have no clue how to structure this project. I could create three directories, one for each application, and copy Common three times into their directories (doesn't seem right).
Another idea that comes to mind is: create a main directory and put all files from Common there, including __init__.py. Then create three subdirectories (submodules), one for each application.
Another way would be installing Common using pip, but that means I would have to reinstall every time I change something in that module.
In previous projects I used .NET - the equivalent in that world would be a Solution with four projects, one of them being Common.
Any suggestions?
I have a similar project that is set up like this
project_root/
    App1/
        __init__.py
    FlaskControlPanel/
        app.py
        static/
        templates/
    models/
        __init__.py
        mymodels.py
Then, I run everything from project_root. I have a small script (either batch or shell depending on my environment) that sets PYTHONPATH=. so that imports work correctly. This is done because I usually develop using PyCharm, where the imports "just work", but when I deploy the final product the path doesn't match what it did in my IDE.
Once the PYTHONPATH is set to include everything from your project root, you can do standard imports.
For example, from my FlaskControlPanel app.py, I have this line:
from models.mymodels import Model1, Model2, Model3
From the App1 __init__.py I have the exact same import statement:
from models.mymodels import Model1, Model2, Model3
I can start the Flask application by running this from my command line (in Windows) while I am in the project_root directory:
setlocal
SET PYTHONPATH=.
python FlaskControlPanel\app.py
The setlocal is used to ensure the PYTHONPATH is only modified for this session.
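The answer mentions a shell variant as well; a minimal Linux/macOS equivalent (my assumption, not shown in the original), run from project_root, would be:

#!/bin/sh
# shell equivalent of the batch script above
PYTHONPATH=. python FlaskControlPanel/app.py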
I like this approach
projects/
    __init__.py
    project1/
        __init__.py
    project2/
        __init__.py
    lib1/
        __init__.py
        libfile.py
    lib2/
        __init__.py
So, I need to cd into the projects folder.
To start a project, use
python -m project_name
This allows me to easily import from any external lib like
from lib1.libfile import [what you want]
or
from lib1 import libfile
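For python -m project1 to work, the package needs a __main__.py; a minimal sketch (not shown in the original answer, contents illustrative):

# projects/project1/__main__.py -- runs on `python -m project1`
from lib1 import libfile  # resolves because the cwd is projects/

def main():
    print("project1 starting")

if __name__ == "__main__":
    main()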
Make standard Python modules from your apps. I recommend a structure like this:
apps/
    common/
        setup.py
        common/
            __init__.py
            models.py
    app1/
        setup.py
        app1/
            __init__.py
            models.py
project/
    requirements.txt
Basic setup.py for app common:
#!/usr/bin/env python
from setuptools import setup, find_packages

setup(
    name='common',
    version='1.0.0',
    packages=find_packages(),
    zip_safe=False,
)
Make a similar setup.py for the other apps.
Set the editable "-e" option for your apps in requirements.txt:
-e apps/common
-e apps/app1
Install requirements with pip:
$ pip install -r requirements.txt
The editable option means that the source files will be linked into the Python environment. Any change in the source files of your apps will have immediate effect without reinstalling them.
Now you can import models from your common app (or any other app) anywhere (in other apps, project files, ...).
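For instance (the model name is hypothetical), from project code or from app1 you could then write:

from common.models import MyModel  # MyModel is an illustrative model class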
I would create a structure like this:
project_root/
    app1/
        __init__.py
        script.py
    common/
        __init__.py
        models.py   (all "common" models)
app1/script.py
import os, sys

# add the parent directory to the Python path
basepath = os.path.join(os.path.dirname(os.path.abspath(__file__)), '..')
if basepath not in sys.path:
    sys.path.append(basepath)

from common.models import VeryCommonModel

print(VeryCommonModel)
If you don't want to set the python path at runtime, set the python path before running the script:
$ export PYTHONPATH=$PYTHONPATH:/path/to/project_root
And then you can do:
python app1/script.py

Accessing names defined in a package's `__init__.py` when running setuptools tests

I've taken to putting module code directly in a package's __init__.py, even for simple packages where this ends up being the only file.
So I have a bunch of packages that look like this (though they're not all called pants:)
pants/
    __init__.py
    setup.py
    README.txt
    test/
        __init__.py
I started doing this because it allows me to put the code in a separate (and, critically, separately versionable) directory, and have it work in the same way as it would if the package were located in a single module.py. I keep these in my dev python lib directory, which I have added into $PYTHONPATH when working on such things. Each package is a separate git repo.
edit...
Compared to the typical Python package layout, as exemplified in Radomir's answer, this setup saves me from having to add each package's directory into my PYTHONPATH.
.../edit
This has worked out pretty well, but I've hit upon this (somewhat obscure) issue:
When running tests from within the package directory, the package itself (i.e. the code in __init__.py) is not guaranteed to be on sys.path. This is not a problem in my typical environment, but if someone downloads pants-4.6.tgz, extracts the source distribution, cds into the directory, and runs python setup.py test, the package pants itself won't normally be on their sys.path.
I find this strange, because I would expect setuptools to run the tests from a parent directory of the package under test. However, for whatever reason, it doesn't do that, I guess because normally you wouldn't package things this way.
Relative imports don't work because test is a top-level package, having been found as a subdirectory of the current-directory component of sys.path.
I'd like to avoid having to move the code into a separate file and importing its public names into __init__.py. Mostly because that seems like pointless clutter for a simple module.
I could explicitly add the parent directory to sys.path from within setup.py, but would prefer not to. For one thing, this could, at least in theory, fail, e.g. if somebody decides to run the test from the root of their filesystem (presumably a Windows drive). But mostly it just feels jerry-rigged.
Is there a better way?
Is it considered particularly bad form to put code in __init__.py?
I think the standard way to package Python programs would be more like this:
setup.py
README.txt
pants/
    __init__.py
    __main__.py
    ...
tests/
    __init__.py
    ...
some_dependency_you_need/
    ...
Then you avoid the problem.
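To make the downloaded-tarball case work with this layout, a minimal setup.py might look like the sketch below (version and test_suite value are assumptions); since setup.py now sits above pants/, and as I understand it the setuptools test command builds the package and puts it on sys.path before running the tests, python setup.py test then finds the package:

# setup.py -- minimal sketch for the layout above
from setuptools import setup, find_packages

setup(
    name='pants',
    version='4.6',
    packages=find_packages(exclude=['tests']),
    test_suite='tests',  # lets `python setup.py test` discover the tests/ package
)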

Python project organization (specially for external libs)

I plan to organize my Python project in the following way:
<my_project>/
    webapp/
        mymodulea.py
        mymoduleb.py
        mymodulec.py
        mylargemodule/
            __init__.py
            mysubmodule1.py
            mysubmodule2.py
    backend/
        mybackend1.py
        mybackend2.py
    lib/
        python_external_lib1.py
        python_external_large_lib2/
            __init__.py
            blabla.py
        python_external_lib2.py
In my development IDE (PyDev) I have set up webapp/, backend/ and lib/ as source folders, and everything of course works.
How can I deploy it on a remote server? Do I have to set PYTHONPATH in a startup script? Or do I have to do it programmatically?
If you are treating webapp, backend, and lib as source folders, then you are importing (for example) mymodulea, mybackend1, and python_external_large_lib2.
Then on the server, you must put webapp, backend, and lib onto your Python path. Doing it in some kind of startup script is the usual way. Doing it programmatically is complicated, because then your code needs to know what environment it's running in to configure the path correctly.
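A hedged sketch of such a startup script (all paths are hypothetical; adjust to wherever the project lives on the server):

#!/bin/sh
# prepend the three source folders to PYTHONPATH, then start the app
export PYTHONPATH="/srv/my_project/webapp:/srv/my_project/backend:/srv/my_project/lib:$PYTHONPATH"
exec python /srv/my_project/webapp/mymodulea.py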
