New to Python packaging (but not Python), so please excuse it if these questions are all over the place.
I'm trying to package a Python module (project-A) which uses a bunch of common scripts from another directory (lib), and I'm hitting a wall on how to include this lib directory in the final package artifact.
The setup.py, the working directory structure, and the installation structure I'm planning for the final package are shown below.
A few more questions:
The relative location of lib changes between my work directory and the installed package. Is there a way to import it in both cases without changing the code?
Is this structure better, or should I move everything related to project_A under its own directory? That way each project would have its own setup.py, MANIFEST, conf directory, etc. (the location of lib would remain the same; I'm not sure whether each package should have its own tests directory).
release=subprocess.check_output(...) fails if someone runs this outside a git repo. Is there a way to fall back to a default value in such cases?
An example I came across had lib moved to /usr/lib/python3.4/site-packages/project_A/_lib. I liked this approach. Any idea how to achieve this in setup.py?
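One hedged way to handle the release fallback asked about above (the function name `git_release` and the default string are assumptions): wrap the git call and catch the failure,

```python
import subprocess

def git_release(default="0.0.1"):
    """Return the first-parent commit count, or `default` outside a git repo."""
    try:
        out = subprocess.check_output(
            ["git", "rev-list", "--count", "--first-parent", "HEAD"],
            stderr=subprocess.DEVNULL,
        )
        return out.decode().rstrip()
    except (subprocess.CalledProcessError, OSError):
        # CalledProcessError: not inside a git repo; OSError: git not installed.
        return default

print(git_release())
```

You can then pass `release=git_release()` in setup() (keeping in mind that `release` itself is not a standard setuptools keyword).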
setup.py (currently creates two packages):
setup(
    name='project_A',
    version='0.15',
    # release is not supported in bdist_rpm
    #release=subprocess.check_output(["git", "rev-list", "--count", "--first-parent", "HEAD"]).rstrip(),
    # if creating outside git
    release="0.0.1",
    author='foo',
    author_email='hello#world.com',
    url='http://www.hello.com',
    long_description="README.txt",
    #install_requires=['bottle','requests','supervisor'], # currently not working
    #dependency_links=['https://pypi.python.org/packages/source/b/bottle/bottle-0.12.8.tar.gz'],
    packages=['project_A'],
    include_package_data=True,
    package_data={'images': ['hello.gif']},
    data_files=[
        #('/etc/init.d/', ['project_Actl']), # some startup script
        ('/var/log/project_A', []),
        ('/etc/project_A/conf/', ['conf/project_A.conf'])
    ],
    description="Hello World testing setuptools",
    tests_require=['pytest'],
    cmdclass={
        'test': PyTest,
        'clean': CleanCommand
    }
)
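For the `_lib` layout asked about in the last question, a sketch (an assumption on my part, not tested against bdist_rpm) that maps the top-level lib directory onto a project_A._lib subpackage via package_dir, so it installs under site-packages/project_A/_lib:

```python
from setuptools import setup

setup(
    name='project_A',
    version='0.15',
    # Ship the shared top-level `lib` directory as the subpackage
    # project_A._lib, so it lands in site-packages/project_A/_lib.
    packages=['project_A', 'project_A._lib'],
    package_dir={
        'project_A': 'project_A',
        'project_A._lib': 'lib',
    },
)
```

Code would then import it as from project_A._lib import parseArguments. Note the workspace layout differs from the installed layout, so in-tree runs still need the project root on sys.path (or an editable install).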
Project-A installation directory -
/usr/lib/python3.4/site-packages/project_A
|
|____project_A/
| |
| |____project_A.py
|
|____lib
| |
| |______init__.py
| |____parseArguments.py
| |____setupLogger.py
| |____cleanup.py
Python workspace directory -
.
|____setup.py
|
|____MANIFEST.in
|
|____README.md
|
|____project_A/
| |
| |____project_A.py
|
|____project_B/
| |
| |____project_B.py
|
|____conf/
| |
| |____project_A.conf
| |____project_B.conf
|
|____lib
| |
| |______init__.py
| |____parseArguments.py
| |____setupLogger.py
| |____cleanup.py
|
|
|____images/
| |____hello.gif
|
|____tests
| |
| |____project_A
| | |____test_A_example.py
| |
| |____project_B
| | |____test_B_example.py
Related
I have a pretty standard Django project and I can't find a way to import /middleware/utils/compare.py from /middleware/utils/request.py.
This is the project tree:
|--middleware/
| |--utils/
| | |--compare.py
| | |--request.py
| |--__init__.py
| |--asgi.py
| |--settings.py
| |--urls.py
| |--views.py
| |--wsgi.py
|--manage.py
Here __init__.py, asgi.py, settings.py, urls.py, and wsgi.py have no major modifications (__init__.py is empty).
# middleware/views.py
from .utils.request import requests # This line works fine
# middleware/utils/request.py
from compare import compare # ModuleNotFoundError: No module named 'compare'
from .compare import compare # Attempted relative import beyond top-level package pylint(relative-beyond-top-level)
# ^^^ pylint flags this line, but it actually runs
from utils.compare import compare # ModuleNotFoundError: No module named 'utils'
from .utils.compare import compare # ModuleNotFoundError: No module named 'middleware.utils.utils'
Note: compare.py has a function named compare; I also tried renaming it but had the same problems.
This might be obvious, but I run the project with python manage.py runserver.
Simply add an empty __init__.py in the utils folder.
And read more about it here:
https://docs.python.org/3/tutorial/modules.html#packages
Change your file tree like this,
|--middleware/
| |--utils/
| | |--__init__.py # Added this file
| | |--compare.py
| | |--request.py
| |--__init__.py
| |--asgi.py
| |--settings.py
| |--urls.py
| |--views.py
| |--wsgi.py
|--manage.py
And, in middleware/utils/request.py, import like this,
from middleware.utils.compare import compare
(the relative form from .compare import compare also works once utils is a package).
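A self-contained sketch of the fix (the package tree is rebuilt in a temp directory purely for illustration): once utils/__init__.py exists, the relative import inside request.py resolves.

```python
import os
import sys
import tempfile

root = tempfile.mkdtemp()
utils = os.path.join(root, "middleware", "utils")
os.makedirs(utils)

# middleware/__init__.py and middleware/utils/__init__.py make both packages.
open(os.path.join(root, "middleware", "__init__.py"), "w").close()
open(os.path.join(utils, "__init__.py"), "w").close()

with open(os.path.join(utils, "compare.py"), "w") as f:
    f.write("def compare(a, b):\n    return a == b\n")

with open(os.path.join(utils, "request.py"), "w") as f:
    # The relative import works because utils is now a package.
    f.write("from .compare import compare\n")

sys.path.insert(0, root)
from middleware.utils.request import compare
print(compare(1, 1))  # → True
```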
I am a beginner in Python and want to test some code in a WebJob and a Function App. I usually write code in C#, and in Visual Studio we have templates to create a WebJob/Function App so we get all the required files and init code. Now, using Python, I need the required file structure and init code.
The folder/file structure for a Python Functions project looks like:
<project_root>/
| - .venv/
| - .vscode/
| - my_first_function/
| | - __init__.py
| | - function.json
| | - example.py
| - my_second_function/
| | - __init__.py
| | - function.json
| - shared_code/
| | - __init__.py
| | - my_first_helper_function.py
| | - my_second_helper_function.py
| - tests/
| | - test_my_second_function.py
| - .funcignore
| - host.json
| - local.settings.json
| - requirements.txt
| - Dockerfile
The init method for a WebJob/Function using Python, in __init__.py:
import azure.functions as func
import logging

def main(req: func.HttpRequest,
         obj: func.InputStream):
    logging.info(f'Python HTTP triggered function processed: {obj.read()}')
Please follow the developer guide for Azure Functions using Python, and Python Azure Functions using VS Code.
I have a project with the following hierarchy:
create_fresh_databases
|
|--src
|  |
|  |--configurations
|  |  |
|  |  |--config.py
|  |  |--config.xml
|  |
|  |--__init__.py
|
|--create_fresh_databases.py
The config.py file contains a Configurations class.
The file create_fresh_databases.py looks like:
from create_fresh_databases.src.configurations.config import Configurations

def main():
    con = Configurations("conf.xml")

if __name__ == "__main__":
    main()
But I'm getting this error:
ModuleNotFoundError: No module named 'create_fresh_databases.src'; create_fresh_databases is not a package
My IDE is PyCharm, and I'm using Python 3.6.
How can I fix it?
I think you should organise the package structure like this:
package:
|
|--configurations:
| |
| |--config.py
| |
| |--config1.xml
|
|--__init__.py
Then you can import it as package.configurations.config (add an __init__.py inside configurations as well, so that it is a package too).
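A self-contained sketch of why the original error appears and what resolves it: create_fresh_databases.py is run as a top-level script, so nothing named create_fresh_databases is importable; importing relative to the project root works. (The temp-directory scaffolding below just recreates the tree for illustration.)

```python
import os
import sys
import tempfile

root = tempfile.mkdtemp()
conf_dir = os.path.join(root, "src", "configurations")
os.makedirs(conf_dir)

# Mark src and src/configurations as packages.
open(os.path.join(root, "src", "__init__.py"), "w").close()
open(os.path.join(conf_dir, "__init__.py"), "w").close()

with open(os.path.join(conf_dir, "config.py"), "w") as f:
    f.write(
        "class Configurations:\n"
        "    def __init__(self, path):\n"
        "        self.path = path\n"
    )

# Running create_fresh_databases.py from the project root puts the root on
# sys.path, so the import drops the project-name prefix:
sys.path.insert(0, root)
from src.configurations.config import Configurations

print(Configurations("config.xml").path)  # → config.xml
```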
I have found a lot of examples explaining how to use uwsgi and emperor mode to deploy multiple apps. For me this means several app folders, with one vassal (ini, socket, application.py) per app.
I have not yet been able to find example configurations with only one app folder and multiple vassals. This should allow me to serve a multi-tenant app (each customer has its own database). I tested this with two instances, and it "seems" to work well.
Is this good practice, or do you have a similar setup?
Does this provide complete isolation between vassal instances?
Here is my setup. I am using an nginx/uwsgi/python stack (I make use of the emperor_pg plugin). This setup lets uwsgi spawn one vassal per customer, A and B. Customers use the URL customerN.mydomain.com/fe1/web.
Nginx config :
# This virtual host catches all incoming traffic on port 80 (security should
# be considered if not talking on a local network)
server {
    listen 80;

    # We capture the subdomain here. It is used to designate a customer entity.
    server_name ~^(?<subdomain>.+)\.mydomain\.fr$;

    # We use a pattern for socket names and paths. This allows vassals to be
    # spawned automatically when changes are detected in the vassals pg table
    # (emperor_pg). Pattern used: /tmp/$subdomain$appname.sock

    location ~favicon\.ico$ {
        root /opt/app-current/web/;
    }

    location ~^\/(?<app_name>.+)\/web\/ {
        root /opt/web-content/$subdomain/;
    }

    location ~^\/(?<app_name>.+)\/ {
        # Route to the socket designated by the standard pattern
        include uwsgi_params;
        uwsgi_pass unix://tmp/$subdomain$app_name.sock;
    }

    # When calling the root of an entity's subdomain, we launch the default
    # app by routing traffic to index.socket
    location / {
        include uwsgi_params;
        uwsgi_pass unix://tmp/$subdomain.sock;
    }
}
The emperor upstart script, using emperor_pg:
# Emperor configuration upstart script in `/etc/init/uwsgi.conf` :
# uWSGI - Manage uWSGI Application Server
description "uWSGI Emperor Mode"
start on (filesystem and net-device-up IFACE=lo)
stop on runlevel [!2345]
respawn
# We use pg mode. This allows to scan a postgresql database.
# requires sudo apt-get install uwsgi-plugin-emperor-pg
exec /usr/bin/uwsgi --uid www-data --gid www-data --plugin emperor_pg --emperor "pg://host=dbserver.com.com user=saasautomator dbname=saasautomator;SELECT name,config,ts,uid,gid,socket FROM vassals" --logto /var/log/uwsgi.log
And here is an example of vassal conf files in the database. They use the same application folder:
saasautomator=> SELECT * FROM vassals;
idvassals | name | config | ts | uid | gid | socket
-----------+---------------------------------+----------------------------------------------------------------+---------------------+-------+-------+---------------------------------------
3 | customerAfe1.ini | [uwsgi] +| 2004-10-19 10:23:54 | uwsgi | uwsgi | /tmp/customerAfe1.sock
| | master = true +| | | |
| | vacuum = true +| | | |
| | chdir = /opt/app/ +| | | |
| | plugins = python +| | | |
| | wsgi-file = /opt/app/fe1/application.py +| | | |
| | processes = 4 +| | | |
| | threads = 2 +| | | |
| | stats = 127.0.0.1:9191 | | | |
4 | customerBfe1.ini | [uwsgi] +| 2004-10-19 10:23:55 | uwsgi | uwsgi | /tmp/customerBfe1.sock
| | master = true +| | | |
| | vacuum = true +| | | |
| | chdir = /opt/app/ +| | | |
| | plugins = python +| | | |
| | wsgi-file = /opt/app/fe1/application.py +| | | |
| | processes = 4 +| | | |
| | threads = 2 +| | | |
| | stats = 127.0.0.1:9192 | | | |
(2 rows)
Thanks!
"Is this good practice, or do you have a similar setup?"
I do have a similar setup. I won't say whether it's good practice, because that's opinion-based, but there are small differences between this and copying the code base for every customer. The main difference: if someone breaks into one app and is able to modify its code, it will affect all apps. But if there are no differences in the apps' configuration that would allow breaking into one app but not another, it won't matter; if someone breaks into one, they can reproduce it on the others.
Also, fixing the changes made after your app is hacked will be easier with a shared codebase.
I can't see any other difference that matters.
"Does this provide complete isolation between vassal instances?"
Not complete, as mentioned above, but everything else can be isolated. This is like sharing one library between applications on a system, which happens all the time. With proper configuration, you can also separate databases and other resources.
If filesystem permissions are applied correctly, you limit the application's ability to change its own code, which in turn limits an attacker's ability to change it.
I want to execute the seeds.py script located in restaurants_project/scripts/. This seed file populates my database with restaurant records, using my Django model Restaurant, based on the CSV data in the same folder. The only way I've found to make the project settings (i.e. restaurants_project.settings) and my models importable is to sys.path.append('MY_PROJECTS_RELATIVE_PATH') in the seeds file. I obviously don't want to do this in production (or do I?), so what's the best way to pull in those settings without getting the following ImportError:
ImportError: Could not import settings 'restaurants_project.settings' (Is it on sys.path? Is there an import error in the settings file?): No module named 'restaurants_project'
Directory Tree:
|___restaurants_project
| |______init__.py
| |______pycache__
| | |______init__.cpython-34.pyc
| | |____settings.cpython-34.pyc
| |____settings.py
| |____urls.py
| |____wsgi.py
|____restaurants
| |______init__.py
| |______pycache__
| | |______init__.cpython-34.pyc
| | |____admin.cpython-34.pyc
| | |____models.cpython-34.pyc
| |____admin.py
| |____migrations
| | |______init__.py
| |____models.py
| |____tests.py
| |____views.py
|____scripts
| |______init__.py
| |______pycache__
| | |______init__.cpython-34.pyc
| |____restaurant_inspections_2014.csv
| |____seeds.py
| |____unique_restaurant_ids.txt
seeds.py:
import os, sys

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'restaurants_project.settings')

import django
django.setup()

import csv

csvfile = open('restaurant_inspections_2014.csv')
reader = csv.reader(csvfile)
headers = next(reader, None)

for row in reader:
    print(row)
    import pdb; pdb.set_trace()
The best way to write a script that uses the settings of a project is to write a custom management command:
https://docs.djangoproject.com/en/1.8/howto/custom-management-commands/
Basically, you should wrap your script in a class that extends BaseCommand and put it inside the management/commands directory of an app.
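A hedged sketch of such a command (the file path restaurants/management/commands/seed_restaurants.py, the command name, and the column mapping are all assumptions; the try/except stub exists only so the sketch runs outside a Django install, and is unnecessary in a real project):

```python
import csv

try:
    from django.core.management.base import BaseCommand
except ImportError:
    # Stand-in so this sketch runs without Django installed; in a real
    # project the import above succeeds and this branch is never taken.
    class BaseCommand:
        help = ""

# Contents of restaurants/management/commands/seed_restaurants.py
class Command(BaseCommand):
    help = "Seed the database from the 2014 inspections CSV"

    def add_arguments(self, parser):
        parser.add_argument("csv_path")

    def handle(self, *args, **options):
        with open(options["csv_path"], newline="") as csvfile:
            reader = csv.reader(csvfile)
            next(reader, None)  # skip the header row
            for row in reader:
                # Restaurant.objects.create(name=row[0], ...)  # map columns here
                pass
```

You would then run it as python manage.py seed_restaurants scripts/restaurant_inspections_2014.csv; Django sets up the settings module and sys.path for you, so no sys.path.append is needed.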