I have an API that uses FastAPI. In a single file (main.py), I have the call to the function that creates the API
from fastapi import FastAPI
# ...
app = FastAPI()
As well as all the endpoints:
@app.post("/sum")
async def sum_two_numbers(number1: int, number2: int):
    return {'result': number1 + number2}
But as the application gets larger, the file is becoming messy and hard to maintain. The obvious solution would be to keep function definitions in separate modules and just import them and use them in main.py, like this:
from mymodules.operations import sum_two_numbers
# ...
@app.post("/sum")
sum_two_numbers(number1: int, number2: int)
Only that doesn't work. I don't know if I'm doing it wrong or it can't be done, but I get this error from VSCode:
Expected function or class declaration after decorator | Pylance
(My program has so many errors that I haven't seen the actual interpreter complaint, but if that's important, I can try to debug it and post it here.)
So is this impossible to do and I have to stick to the one-file API, or it is possible to do, but I'm doing it wrong? If the second, what is the right way?
The common solution is to split your application into subrouters. Instead of registering your views directly on app, you create an instance of APIRouter (from fastapi import APIRouter) inside each of your modules, then you register these subrouters on your main application.
Inside a dedicated api module, such as api/pages.py:
from fastapi import APIRouter
router = APIRouter()
@router.get('')
async def get_pages():
    return ...
Then, in your main application, import the modules and register each subrouter:
from .api import (
    pages,
    posts,
    users,
)
app.include_router(pages.router, prefix='/pages')
app.include_router(posts.router, prefix='/posts')
app.include_router(users.router, prefix='/users')
Another powerful construct you can use is to have two dedicated base routers, one that requires authentication and one that doesn't:
from fastapi import APIRouter, Depends

unauthenticated_router = APIRouter()
authenticated_router = APIRouter(dependencies=[Depends(get_authenticated_user)])
.. and you can then register the different routes under each router, depending on whether you want to guard the route with an authenticated user or not. You'd have two subrouters inside each module, one for endpoints that require authentication and one for those that don't, and name them appropriately (and if you have no public endpoints, just use authenticated_router as the single name).
unauthenticated_router.include_router(authentication.router, prefix='/authenticate')
unauthenticated_router.include_router(users.unauthenticated_router, prefix='/users', tags=['users'])
authenticated_router.include_router(users.router, prefix='/users')
Any subrouter registered under authenticated_router will have the get_authenticated_user dependency evaluated first, which in this case would raise a 401 error if the user wasn't logged in. You can then authorize further based on roles etc. in the dependencies for the view function - but this makes it very explicit whether you want your endpoint to end up in a chain that requires authentication or not.
So is this impossible to do and I have to stick to the one-file API, or it is possible to do, but I'm doing it wrong?
A one-file API is just for demo/test purposes. In the real world you always build a multi-file API, especially with a framework like FastAPI, where you use different types of files: Pydantic models, DB models, etc.
If the second, what is the right way?
There is no "right way"; there are ways that fit your needs.
You can follow the advanced user guide to see a good example.
What the docs suggest:
.
├── app
│   ├── __init__.py
│   ├── main.py
│   ├── dependencies.py
│   ├── routers
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   └── internal
│       ├── __init__.py
│       └── admin.py
What I use when I have a DB, Pydantic models, etc.:
.
├── app
│   ├── __init__.py
│   ├── main.py
│   ├── dependencies.py
│   ├── routers
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   ├── models
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   ├── schemas
│   │   ├── __init__.py
│   │   ├── items.py
│   │   └── users.py
│   └── internal
│       ├── __init__.py
│       └── admin.py
Here, models holds the DB models and schemas holds the Pydantic models.
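As an illustrative sketch of that split (assuming SQLAlchemy for the DB models and Pydantic for the schemas; the Item example is hypothetical):

```python
# app/models/items.py -- DB model: how an item is stored in the database
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = "items"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

# app/schemas/items.py -- Pydantic schema: how an item crosses the API boundary
from pydantic import BaseModel

class ItemSchema(BaseModel):
    id: int
    name: str
```

Keeping the two kinds of models in separate packages makes it obvious which class talks to the database and which one shapes request/response payloads.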
General issue:
I have an abstract model that I want to test with a real model instance, however I don't want to have to completely restructure my project/test format. See these two answers for reference: First answer Second answer
I want to
A) Define models inside each test app folder and not define them inside the actual apps
B) Not have an additional/separate apps.py and settings.py configuration for each test folder.
I know this is possible because the Django project has a very similar test structure, but they use a test run script that I can't entirely decipher.
The test/ folder mirrors the app structure I have (which are blog/, projects/, richeditable/).
backend
├── projects
├── blog
├── richeditable
└── tests
    ├── __init__.py
    ├── conftest.py
    ├── blog
    │   └── ...
    ├── projects
    │   └── ...
    └── richeditable
        ├── __init__.py
        ├── test_model.py
        └── models.py  <-- this defines a model that inherits from richeditable.models
# richeditable/models.py
from django.db import models

class Draftable(models.Model):
    blah = models.IntegerField()

    class Meta:
        abstract = True
# tests/richeditable/models.py
class DummyDraftable(Draftable):
    additional_field = models.BooleanField()

    # class Meta:
    #     app_label = "some_app"
    # ^--- Django does not seem to like having this blank and I don't know what to put to make this work
# tests/richeditable/test_model.py
import pytest

@pytest.fixture
def add_app(settings):
    settings.INSTALLED_APPS += ['backend.tests.richeditable']
    # presumably the above fixture would affect the apps/models
    # settings before the database is created, but that does not seem to be the case

def test_please_work(add_app, db):
    assert DummyDraftable.objects.create(blah=1, additional_field=True) is not None
My best guess from what I've seen of the Django project's script is that it loads each folder for testing as a module and adds it to INSTALLED_APPS at run time, before the test cases. However, you can't simply change INSTALLED_APPS, because models are being added and migrations have to be applied to the test database beforehand, and there seems to be a need to define an AppConfig (because Django loses its mind if you don't). I've tried to include the app_label Meta field for models, but it didn't work, or I may have done something wrong. But the point is, I don't see the script creating an AppConfig, and they somehow don't have to declare Meta in their models.py.
Pytest specific stuff:
Things get further complicated with pytest-django because it doesn't use Django's TestRunner interface. This is how you would do it if that were the case (note the order of operations). I have already tried modifying the settings pytest fixture before instantiating the db with the associated fixtures, but this doesn't end up loading the module no matter what I do. From looking at the source code, it seems like the settings are fixed in place based on what settings.py specifies, and modifying the settings fixture makes no difference to app loading.
So I ended up solving my own issue. As @hoefling mentioned, the best way is to create a separate settings file that extends your normal settings file and specify that as your pytest settings file.
# pytest.ini
[pytest]
addopts = --ds=config.settings.test
An important thing to note is that you cannot give your test modules the same name as existing apps, as I found out. So the structure I had, with backend/tests/richeditable, is not allowed no matter what you do. So I prepended each folder with test_ and it works fine. This also solves the issue of having to include app_label in your models.
# testsettings.py
from .settings import *  # noqa

INSTALLED_APPS += [
    'backend.tests.test_projects',
    'backend.tests.test_blog',
    'backend.tests.test_richeditable',
]
backend
├── projects
├── blog
├── richeditable
└── tests
    ├── __init__.py
    ├── conftest.py
    ├── test_blog
    │   └── ...
    ├── test_projects
    │   └── ...
    └── test_richeditable
        ├── __init__.py
        ├── test_model.py
        └── models.py
I am building a Python package, which has a subpackage called config, in which I have defined different files that contain some global variables used as configuration by other modules.
.
└── mypackage
├── base.py
├── config
│ ├── completion.py
│ ├── __init__.py
│ ├── locals.py
│ └── queries.py
├── encoders.py
├── exceptions.py
├── functions
│ ├── actions.py
│ ├── dummy.py
│ ├── __init__.py
│ ├── parsers.py
│ └── seekers.py
├── __init__.py
├── query.py
└── utils
├── dict.py
├── __init__.py
└── retry.py
For example, the file mypackage/config/queries.py has the following content:
INCLUDE_PARENTS = False
Whereas in the main file mypackage/base.py, I have a function which takes this config variable as a default argument:
import mypackage.config.queries as conf
def query(include_parent_=conf.INCLUDE_PARENTS, **kwargs):
    # do stuff depending on include_parent_ argument
    ...
What I want, and what I haven't been able to find in other similar questions, is to be able to dynamically modify these variables in a Python/Ipython console session. That is, I should be able to do the following on Ipython:
In [1]: import mypackage as mp
In [2]: mp.config.INCLUDE_PARENTS = True # Its default value was False
In [3]: mp.query()
Out[3]: # result with include_parent_ argument set to True
In [4]: mp.config.INCLUDE_PARENTS = False # Now I set the value back to False
In [5]: mp.query()
Out[5]: # result with include_parent_ argument set to False
But I don't understand why I am not able to achieve it. I have tried importing the configuration variables in __init__.py with their associated namespace, but I never manage to change the global configuration variables dynamically, as Pandas does, for example.
The issue is that you are using conf.INCLUDE_PARENTS as a default parameter of a function. A default parameter is evaluated when the function is created, not when it is called. Thus, when you change the variable later, the default inside the function does not change. The following should work as you expect.
def query(include_parent_=None, **kwargs):
    if include_parent_ is None:
        include_parent_ = conf.INCLUDE_PARENTS
    # do stuff depending on include_parent_ argument
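The difference is easy to demonstrate in isolation. In this self-contained sketch, a module-level flag stands in for conf.INCLUDE_PARENTS (names are illustrative):

```python
INCLUDE_PARENTS = False  # stand-in for conf.INCLUDE_PARENTS

def query_captured(include_parent_=INCLUDE_PARENTS):
    # default evaluated once, when `def` runs
    return include_parent_

def query_deferred(include_parent_=None):
    # sentinel pattern: the real value is looked up at call time
    if include_parent_ is None:
        include_parent_ = INCLUDE_PARENTS
    return include_parent_

INCLUDE_PARENTS = True
print(query_captured())  # False -- still the value captured at definition time
print(query_deferred())  # True -- picks up the change
```

This is the same reason mutable defaults (like `def f(x=[])`) behave surprisingly: the default object is created exactly once.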
I have the following project structure for a Flask app using flask-restx
.
├── app
│ ├── extensions.py
│ ├── __init__.py
│ └── pv_dimensioning
│ ├── controller.py
│ ├── __init__.py
│ ├── models
│ │ ├── dto.py
│ │ ├── __init__.py
│ │ └── vendor_models.py
│ ├── services
│ │ ├── calculator.py
│ │ ├── database.py
│ │ ├── data.py
│ │ ├── db_crud.py
│ │ ├── __init__.py
│ │ └── processor.py
│ └── utils
│ ├── decode_verify_jwt.py
│ ├── decorator.py
│ └── __init__.py
├── config.py
├── main.py
├── package.json
├── package-lock.json
├── Pipfile
├── Pipfile.lock
├── README.md
├── serverless.yml
└── tests
├── __init__.py
├── test_calculator.py
├── test_config.py
└── test_processor.py
In controller.py, I am adding the add_argument() statements and parsing them in the API routes. In one of the add_argument() statements, I would like to offer choices to the user. To get the choices, I query the database and get a list of available values. I then convert this list to a tuple, assign it to a variable, and pass it as the choices parameter in add_argument().
My codes:
data.py
from ..models.vendor_models import AdminVendor

def data(app):
    values = AdminVendor.query.all()
    v = [value.name for value in values]
    return {'v': tuple(v)}
controller.py
from flask_restx import Resource, reqparse
parser = reqparse.RequestParser()
parser.add_argument(
    "vendor",
    choices=vendors,  # <--- The values of v should be added here
    help="Select the vendor",
)
@ns.route("/")
class UserOutput(Resource):
    @ns.doc(
        "Get calculated response",
        responses={
            200: "Values returned",
            400: "Validation Error",
            403: "Not authorized",
        },
    )
    @ns.expect(parser, validation=True)
    def get(self):
        args = parser.parse_args()
        return DimensionCalculator.inputs(**args), 200
where ns is the namespace.
My __init__.py file in the app folder is as follows:
from flask import Flask
from .extensions import cors, db, ma
def create_app(app_config):
    app = Flask(__name__)
    app.config.from_object(app_config)
    register_blueprints(app)
    register_extensions(app)
    return app

def register_extensions(app):
    cors.init_app(app)
    db.init_app(app)
    ma.init_app(app)

def register_blueprints(app):
    from .pv_dimensioning import dimensioning_blueprint
    app.register_blueprint(dimensioning_blueprint)
and the entry point to the app is main.py
import os
from app import create_app
from app.extensions import db
from app.pv_dimensioning.services.data import data
from config import config_by_name
config_name = os.getenv("FLASK_CONFIG") or "default"
app_config = config_by_name[config_name]
app = create_app(app_config)
db.create_all(app=app)
with app.app_context():
    v = data(app)
    print(v)
The output of print(v) is as follows:
{'v': ('Solarmodul Canadian Solar HiKu CS3L-370MS 370Wp', 'Solarmodul Longi LR4-60HIH-370M, 370Wp', 'Solarmodul Solar Fabrik mono S3 - Halfcut 360Wp', 'Solarmodul Energetica e.Classic M HC black - 360Wp', 'Solarmodul Yingli YL280P-29b-YGE 60 Cell Series 2 - poly, 280Wp', 'Solarmodul Suntech Power STP370S-B60/Wnh, 370Wp', 'Solarmodul AXITEC AXIworldpremium X HC AC-340MH/120S, 340Wp', 'Solarmodul Longi LR4-72HIH-440M, 440Wp', 'Solarmodul Seraphim SRP-330-BMB-DG 330Wp', 'Solarmodul Sharp NU-JD 440Wp')}
I want these values of v to be used in controller.py in the 'vendor' argument.
I have tried getting the values of v from main.py by adding from main import v in the controller.py, but it shows the following error
ImportError: cannot import name 'v' from 'main'
What is the mistake I am making?
I'm not an expert on flask_restx, but from my understanding, the choices argument takes an iterable, so you should simply be able to pass in the return value of your data function.
data.py
from ..models.vendor_models import AdminVendor

def data():
    values = AdminVendor.query.all()
    v = [value.name for value in values]
    return {'v': tuple(v)}
controller.py
from flask_restx import Resource, reqparse
from .services.data import data
parser = reqparse.RequestParser()
parser.add_argument(
    "vendor",
    choices=data()['v'],
    help="Select the vendor",
)
Regarding the import error: as Mindslave points out, that is most likely a circular import error (see this question for a bit more detail). Generally these can be avoided by moving the import from the top of the module into a function/class, e.g.:
from flask_restx import Resource, reqparse

def load_parser():
    from .services.data import data  # deferred to avoid a circular import
    parser = reqparse.RequestParser()
    parser.add_argument(
        "vendor",
        choices=data()['v'],
        help="Select the vendor",
    )
    return parser

parser = load_parser()
As a side note, be aware that reqparse is scheduled to be removed from flask_restx, so it might be worth considering a different option before you get too embedded with it:
Warning: The whole request parser part of Flask-RESTX is slated for removal and will be replaced by documentation on how to integrate with other packages that do the input/output stuff better (such as marshmallow). This means that it will be maintained until 2.0 but consider it deprecated. Don't worry, if you have code using that now and wish to continue doing so, it's not going to go away any time too soon.
source: https://flask-restx.readthedocs.io/en/latest/parsing.html
I'm currently writing a library in python. I have a package called Selectors as a sub-directory of the library. I am trying to implement a new module in the package, but when I try to import it I get the error:
NameError: name '_RaceSelector__ResultSelector' is not defined
My directory looks like this:
Selectors
├── __init__.py
├── __init__.pyc
├── __pycache__
│   ├── SeasonSelector.cpython-38.pyc
│   ├── Selector.cpython-38.pyc
│   ├── __init__.cpython-38.pyc
│   ├── race_selector.cpython-38.pyc
│   ├── result_selector.cpython-38.pyc
│   └── season_selector.cpython-38.pyc
├── race_selector.py
├── race_selector.pyc
├── result_selector.py
├── result_selector.pyc
├── season_selector.py
├── season_selector.pyc
├── selector.py
└── selector.pyc
I want to use the modules in race_selector.py, here is that file:
from .selector import __Selector
from .result_selector import __ResultSelector

class RaceSelector(__Selector):
    data = []
    loaded_races = []
    header = []
    result_selector = __ResultSelector()
selector.py
import os
import csv

class __Selector:
    def __init__(self, file_name):
        self.data_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), '../data/' + file_name + '.csv')
        self.raw_data = self.load_data()
        self.data = self.get_data()
result_selector.py
import os
from .selector import __Selector

class __ResultSelector(__Selector):
    def __init__(self):
        super().__init__('results')
I am able to import selector just fine and it works as intended, but result_selector produces the error.
Thanks
When you do the following:
result_selector = __ResultSelector()
Python searches for _RaceSelector__ResultSelector because there are two leading underscores.
As mentioned in PEP8:
If your class is intended to be subclassed, and you have attributes that you do not want subclasses to use, consider naming them with double leading underscores and no trailing underscores. This invokes Python's name mangling algorithm, where the name of the class is mangled into the attribute name. This helps avoid attribute name collisions should subclasses inadvertently contain attributes with the same name.
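A minimal reproduction of the mangling (class names here are illustrative):

```python
class __Helper:
    value = 42

class Consumer:
    # Inside a class body, the name __Helper is rewritten to
    # _Consumer__Helper, which does not exist, so the lookup fails.
    try:
        helper = __Helper
    except NameError as exc:
        error = str(exc)

print(Consumer.error)  # name '_Consumer__Helper' is not defined
```

The practical fix is to drop the double leading underscores from module-level class names (e.g. _Selector and _ResultSelector, or simply Selector), since mangling is meant for class attributes, not for classes shared across modules.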
What is the last executed function/method that can accept parsed data?
I'm writing a specific spider for parsing a backend. My idea is to:
create a list of pages and "yield" over them (done),
create public constants to hold the parsed data,
pass the public data to Jinja2 and get an HTML file.
The structure:
scrapyspider/
├── myspider
│ ├── __init__.py
│ ├── items.py
│ ├── pipelines.py
│ ├── settings.py
│ └── spiders
│ └── the_spider.py
├── scrapy.cfg
└── template.html
All of my code is in the_spider.py, and I would like to know: what is the last executed function to which I can pass the parsed data from yielding/parsing? I keep getting lost in the parallel execution of yields and callbacks.
If the answer is to pass each "page" to a pipeline, how to do that?
Thanks!