Django: add urls from one project to another project - python

I have an existing Django project (let's call it "main") with several applications in it. There is a separate Django project with one application inside it (we will call this one "second"). Here is a generalized file structure for project "second":
my_second_project
│ manage.py
│ models.py
│ my_models.py
│ my_views.py
│
├───myapp
│ │ admin.py
│ │ apps.py
│ │ Funcs.py
│ │ models.py
│ │ tests.py
│ │ urls.py <-- urls from here are included in the project urls file
│ │ views.py
│ │ __init__.py
│ │
│ ├───migrations
│ │ └───...
├───my_second_project
│ │ asgi.py
│ │ settings.py
│ │ urls.py <-- HERE are all the urls I need
│ │ wsgi.py
│ │ __init__.py
├───templates
│ ...
│
└───__pycache__
models.cpython-37.pyc
Here is a generalized file structure for project "main":
main_project
├───app ...
│ ├───...
├───main_project
│ ├───media
│ │ └───user_uploads
│ ├───settings
│ │ └───base.py
│ └───urls.py
├───app ...
│ ├───...
├───app ...
│ ├───...
└───static
├...
I need to integrate the "second" project into my existing one (the main project), ideally without making any changes to the second project. I tried to do it the same way applications are integrated (via a urls include), but it seems that this does not work with projects, because Django reports "myapp module not found":
url('data-classifier/', include('my_second_project.my_second_project.urls'))
Is there some way to add the "second" project to my "main" project without changing the "second" one?

When you deploy these projects they won't be stored in nearby directories; ideally they won't even be on the same server.
So, if you can't afford to copy (or move) the contents of the app you need from the second project into the main one, and you don't want to redirect with nginx, make a small app in your main project and redirect from its urls.py to the endpoints of the second project:
# main_project/my_second_project/urls.py
from django.urls import path
from django.views.generic import RedirectView

app_name = 'my_second_project'

urlpatterns = [
    path('endpoint/', RedirectView.as_view(url='<my_second_project_url>'), name='endpoint'),
]
If you're running main locally on port 8000 and second on port 8001, then you'd put 'http://localhost:8001/endpoint/' as the url above.
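On the main side you would then mount this small app under the prefix from the question. A minimal sketch, assuming the redirect app lives at main_project/my_second_project/ and is added to INSTALLED_APPS:

# main_project/urls.py (sketch)
from django.urls import include, path

urlpatterns = [
    # forwards /data-classifier/endpoint/ to the second project's server
    path('data-classifier/', include('main_project.my_second_project.urls')),
]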

Related

How to properly manage external dependencies in prefect flow?

I would like to implement one central Prefect project where, over time, it will be possible to add flows that are independent of each other. The structure of the project is something like this:
prefect/
├── src/
│ ├── flows/
│ │ ├── test_pack1/
│ │ │ ├── common/
│ │ │ │ ├── __init__.py
│ │ │ │ └── test_module.py
│ │ │ ├── .env
│ │ │ ├── __init__.py
│ │ │ ├── requirements.txt
│ │ │ └── test_pack1_flow.py
│ │ ├── test_pack2/
│ │ │ ├── __init__.py
│ │ │ ├── .env
│ │ │ ├── requirements.txt
│ │ │ └── test_pack2_flow.py
│ │ ├── __init__.py
│ │ └── Dockerfile
│ ├── utilities/
│ │ ├── __init__.py
│ │ ├── storage.py
│ │ ├── builder.py
│ │ ├── executor.py
│ │ └── run_config.py
│ ├── .env
│ ├── __init__.py
│ └── main.py
├── .gitignore
├── poetry.lock
└── pyproject.toml
I would like each flow in the flows/ folder to be independent of the central project and built as a separate Docker container.
At startup, builder.py searches for all flows in the flows/ folder, sets a specific configuration, and registers them on the server.
But I ran into the problem of importing third-party packages. Let's say test_pack1/requirements.txt contains SQLAlchemy==1.4.34, test_pack1/common/test_module.py does import sqlalchemy, and test_pack1/test_pack1_flow.py has a @task that uses a function from test_module.py. When the FlowBuilder class looks for a flow variable in test_pack1_flow.py, it does so via flow = extract_flow_from_file(str(flow_module)). At this step a ModuleNotFoundError occurs, since that dependency does not exist in the central Prefect application (in pyproject.toml). But when the Docker container is created, after flow.register(), of course it will already be there. How can I handle this step? Or maybe I'm doing something wrong?
I use Docker Storage, Docker Run and Local Executor.
This is a matter of packaging flow code dependencies, and it's all definitely doable. Since this was cross-posted on Prefect Discourse here, I responded in much more detail there.
Here is a short summary:
You can use the prefect register CLI instead of building custom builder.py functionality that loops over flows.
You can have a custom utility function that sets different storage and run_config options based on your environment (dev/stage/prod, etc.).
To handle dependencies that exist in the Docker image but not in your local environment, you can define them in a custom package with a setup.py, as sketched below.
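A minimal sketch of such a package for test_pack1 (the distribution name and the pinned SQLAlchemy version come from the question; everything else is illustrative). Installing it locally with pip install -e . lets the registration step import the flow, while the Docker image installs the same package:

# src/flows/test_pack1/setup.py
from setuptools import find_packages, setup

setup(
    name="test_pack1",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["SQLAlchemy==1.4.34"],  # mirrors the flow's requirements.txt
)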

Failed to load docs in FastAPI

When I try to use the default docs in FastAPI, it never works. If I use /docs it gives an error saying:
"Failed to load API definition. Fetch error: Internal Server Error /openapi.json"
and if I use /redoc it loads forever and nothing ever appears. I searched a lot, but the only thing I found was someone using root_path="/folder/" inside the main FastAPI() instance, so I used the app folder, which is the root of the whole API project. I don't know if it will help you guys, but the tree of the project is:
C:.
├───.pytest_cache
│ └───v
│ └───cache
├───alembic
├───app
│ ├───models
│ │ └───__pycache__
│ ├───routers
│ │ └───__pycache__
│ ├───tests
│ │ ├───.pytest_cache
│ │ │ └───v
│ │ │ └───cache
│ │ └───__pycache__
│ └───__pycache__
├───venv
│ ├───Include
│ │
│ └───Scripts
└───__pycache__
P.S. I left out most of the venv directory from the listing because I thought it was useless here, and when I tried to copy and paste the whole output, Stack Overflow would only accept part of it because it was too big.
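For reference, the root_path workaround mentioned above would look roughly like this; a minimal sketch only, with "/app" standing in for whatever prefix the API is actually served under (note that root_path only tells the docs where the app is mounted; it does not by itself fix errors raised while generating openapi.json):

from fastapi import FastAPI

# "/app" is a placeholder prefix, not taken from the question
app = FastAPI(root_path="/app")

@app.get("/ping")
def ping():
    return {"ok": True}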

How do I target the Django settings module in an environment variable on PythonAnywhere?

I want to run my stand-alone script csvImp.py, which interacts with the database used by my Django site BilliClub. I'm running the script from the project root (~/BilliClub) in my virtualenv django2.
I've followed the instructions here, but for DJANGO_SETTINGS_MODULE rather than the secret key. The part that trips me up is what value to assign to the environment variable. Every iteration I've tried has yielded an error like ModuleNotFoundError: No module named 'BilliClub' after running
(django2) 04:02 ~/BilliClub $ python ./pigs/csvImp.py
I am reloading the shell every time I change the variable in .env, so the postactivate script runs each time, and I'm making sure to re-enter my virtualenv. The problem just seems to be my inability to figure out how to write the path to the settings module.
The .env file:
# /home/username/BilliClub/.env #
DJANGO_SETTINGS_MODULE="[what goes here???]"
Full path of my settings.py is /home/username/BilliClub/BilliClub/settings.py.
Abridged results from running tree:
(django2) 04:33 ~ $ tree
.
├── BilliClub
│ ├── BilliClub
│ │ ├── __init__.py
│ │ ├── settings.py
│ │ ├── urls.py
│ │ └── wsgi.py
│ ├── manage.py
│ ├── media
│ ├── pigs
│ │ ├── __init__.py
│ │ ├── admin.py
│ │ ├── apps.py
│ │ ├── bc2019.csv
│ │ ├── csvImp.py
│ │ ├── models.py
│ │ ├── models.pyc
│ │ ├── tests.py
│ │ ├── urls.py
│ │ └── views.py
│ └── ...
It looks like you should make csvImp a custom management command, and then
DJANGO_SETTINGS_MODULE is "BilliClub.settings". When you write your utility function as a Django management command you get all the Django configuration for free, and the root directory of your command is the same as the root directory of the web app: the directory where manage.py is.
Take a look at https://docs.djangoproject.com/en/3.1/howto/custom-management-commands/
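A minimal sketch of that approach; the Pig model name and the CSV-to-field mapping are assumptions, since the question doesn't show pigs/models.py:

# pigs/management/commands/csvimp.py
# (management/ and management/commands/ each need an empty __init__.py)
import csv

from django.core.management.base import BaseCommand
from pigs.models import Pig  # hypothetical model name


class Command(BaseCommand):
    help = "Import rows from a CSV file into the database"

    def add_arguments(self, parser):
        parser.add_argument("csv_path")

    def handle(self, *args, **options):
        with open(options["csv_path"], newline="") as f:
            for row in csv.DictReader(f):
                Pig.objects.create(**row)  # assumes CSV headers match model fields
        self.stdout.write(self.style.SUCCESS("Import finished"))

You would then run it from ~/BilliClub as python manage.py csvimp pigs/bc2019.csv; manage.py sets DJANGO_SETTINGS_MODULE itself, so the .env value only matters for code that runs outside manage.py.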

How to import models from one Django app to another?

I tried to import models from one app to another by using the following:
from ..appName.models import className
also
from appName.models import className
but an "attempted relative import beyond top-level package" error occurred. My aim is to create a student management system: in one app I created the model, and in another app I try to retrieve data from that model, but I failed to do it.
├───Save_to_database
│ ├───HelpingClass
│ │ └───__pycache__
│ ├───migrations
│ │ └───__pycache__
│ └───__pycache__
├───Search
│ ├───migrations
│ │ └───__pycache__
│ └───__pycache__
├───StudentApp
│ └───__pycache__
└───templates
I want to import models from Save_to_database into Search.
And yes, I added the app in settings.py.
Excuse my English.
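If Save_to_database is listed in INSTALLED_APPS, the usual fix is an absolute import from the app package rather than a relative one; className below stands for whatever the model is actually called:

# Search/views.py (or wherever the model is needed)
from Save_to_database.models import className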

django multiple views file

I separate my views into several files.
So say I have this:
website/
│ manage.py
│
├───app
│ │ admin.py
│ │ admin.pyc
│ │ models.py
│ │ models.pyc
│ │ tests.py
│ │ views.py
│ │ views.pyc
│ │ views_home.py
│ │ views_home.pyc
│ │ __init__.py
│ │ __init__.pyc
│ │
│ └───templates
│
├───locale
│
│
└───website
settings.py
settings.pyc
urls.py
urls.pyc
wsgi.py
wsgi.pyc
__init__.py
__init__.pyc
In my urls.py I import each views file. I write:
url(r'^home/$', views_home.home),
One thing to mention: I also import all the views from views.py into the other views files, meaning in views_home.py I do
from views import *
because I want some functions that live in views.py to be available there.
I have a lot of imports in the views files; some of them are Django helpers and response types (Http404, HttpResponse, render, etc.), plus more external libraries.
The point is: if I turn my views into a package - a views folder with an __init__.py file that holds my views files - how can I avoid importing the same objects that every views file needs (like HttpResponse) in each file of the package?
Can I write an import in the __init__.py file that will be imported for all the files in the package (like, say, HttpResponse)?
I usually see the __init__.py file empty; what use does it have besides telling Python that the folder is a package?
You cannot - and should not try to - "avoid" this; it is a fundamental principle in Python that all names used in a module are defined there or imported explicitly.
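You can, however, use __init__.py to re-export the view functions so that urls.py keeps a single views import, while each submodule still spells out its own Django imports. A rough sketch (the module and view names follow the question; the rest is illustrative):

# app/views/__init__.py
from .views_home import home  # re-export so urls.py can refer to views.home

# app/views/views_home.py
from django.http import HttpResponse  # each module imports what it uses, explicitly

def home(request):
    return HttpResponse("home page")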
