When working with modular imports in FastAPI and SQLModel, I get the following error when I open /docs:
TypeError: issubclass() arg 1 must be a class
Python 3.10.6
pydantic 1.10.2
fastapi 0.85.2
sqlmodel 0.0.8
macOS 12.6
Here is a reproducible example.
user.py
from typing import List, TYPE_CHECKING, Optional
from sqlmodel import SQLModel, Field
if TYPE_CHECKING:
    from item import Item

class User(SQLModel):
    id: int = Field(default=None, primary_key=True)
    age: Optional[int]
    bought_items: List["Item"] = []
item.py
from sqlmodel import SQLModel, Field
class Item(SQLModel):
    id: int = Field(default=None, primary_key=True)
    price: float
    name: str
main.py
from fastapi import FastAPI
from user import User
app = FastAPI()

@app.get("/", response_model=User)
def main():
    return {"message": "working just fine"}
I followed the tutorial from SQLModel: https://sqlmodel.tiangolo.com/tutorial/code-structure/#make-circular-imports-work.
If I put the models in the same file, it all works fine. As my actual models are quite complex, I need to rely on modular imports, though.
Traceback:
Traceback (most recent call last):
  File "/Users/felix/opt/anaconda3/envs/fastapi_test/lib/python3.10/site-packages/fastapi/utils.py", line 45, in get_model_definitions
    m_schema, m_definitions, m_nested_models = model_process_schema(
  File "pydantic/schema.py", line 580, in pydantic.schema.model_process_schema
  File "pydantic/schema.py", line 621, in pydantic.schema.model_type_schema
  File "pydantic/schema.py", line 254, in pydantic.schema.field_schema
  File "pydantic/schema.py", line 461, in pydantic.schema.field_type_schema
  File "pydantic/schema.py", line 847, in pydantic.schema.field_singleton_schema
  File "pydantic/schema.py", line 698, in pydantic.schema.field_singleton_sub_fields_schema
  File "pydantic/schema.py", line 526, in pydantic.schema.field_type_schema
  File "pydantic/schema.py", line 921, in pydantic.schema.field_singleton_schema
  File "/Users/felix/opt/anaconda3/envs/fastapi_test/lib/python3.10/abc.py", line 123, in __subclasscheck__
    return _abc_subclasscheck(cls, subclass)
TypeError: issubclass() arg 1 must be a class
TL;DR
You need to call User.update_forward_refs(Item=Item) before the OpenAPI setup.
Explanation
So, this is actually quite a bit trickier, and I am not quite sure yet why it is not mentioned in the docs. Maybe I am missing something. Anyway...
If you follow the traceback, you'll see that the error occurs because line 921 of pydantic.schema, in the field_singleton_schema function, performs the check issubclass(field_type, BaseModel), and at that point field_type is not in fact a class.
A bit of debugging reveals that this occurs when the schema for the User model is being generated and the bought_items field is being processed. At that point the annotation is processed, and the type argument for List is still a forward reference to Item, meaning it is not the actual Item class itself. That is what gets passed to issubclass and causes the error.
This is a fairly common problem when dealing with recursive or circular relationships between Pydantic models, which is why they were so kind as to provide a special method just for that. It is explained in the Postponed annotations section of the documentation. The method is update_forward_refs, and as the name suggests, it is there to resolve forward references.
What is tricky in this case is that you need to provide it with an updated namespace to resolve the Item reference. To do that, you need to actually have the real Item class in scope, because that is what needs to be in that namespace. Where you do it does not really matter. You could, for example, import the User model into your item module and call it there (obviously below the definition of Item):
from sqlmodel import SQLModel, Field
from .user import User

class Item(SQLModel):
    id: int = Field(default=None, primary_key=True)
    price: float
    name: str

User.update_forward_refs(Item=Item)
But that call needs to happen before an attempt is made to set up that schema. Thus you'll at least need to import the item module in your main module:
from fastapi import FastAPI
from .user import User
from . import item

api = FastAPI()

@api.get("/", response_model=User)
def main():
    return {"message": "working just fine"}
At that point it is probably simpler to have a sub-package with just the model modules and import all of them in the __init__.py of that sub-package.
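For illustration, a sketch of that layout (the package and module names here are just placeholders):

# models/__init__.py
from .user import User
from .item import Item

# all model classes are in scope here, so the forward reference can be resolved
User.update_forward_refs(Item=Item)

Importing anything from models then ensures the references are resolved before FastAPI builds the OpenAPI schema.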
The reason I gave the example of putting the User.update_forward_refs call below your Item definition is that these situations typically occur when you actually have a circular relationship, i.e. if your Item class also had a users field, for example, typed as list[User]. Then you'd have to import User there anyway and might as well just update the references there.
In your specific example, you don't actually have any circular dependencies, so there is strictly speaking no need for the TYPE_CHECKING escape. You can simply do from .item import Item inside user.py and put the actual class in your annotation as bought_items: list[Item]. But I assume you simplified the actual use case and simply forgot to include the circular dependency.
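For completeness, a sketch of user.py for that simplified case, without the TYPE_CHECKING escape:

from typing import List, Optional
from sqlmodel import SQLModel, Field
from .item import Item

class User(SQLModel):
    id: int = Field(default=None, primary_key=True)
    age: Optional[int]
    bought_items: List[Item] = []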
Maybe I am missing something and someone else here can find a way to call update_forward_refs without the need to provide Item explicitly, but this way should definitely work.
For anyone ending up here who (just like me) got the same error but couldn't resolve it using the solution above: my script looked like the following. It seems that SQLModel relies on pydantic.BaseModel, so this solution also applies here.
from pydantic import BaseModel

class Model(BaseModel):
    values: list[int, ...]

class SubModel(Model):
    values = list[int, int, int]
It took me a long time to realize what my mistake was, but in SubModel I used = (assignment) whereas I should have used : (type hint).
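For clarity, a sketch of the corrected class (the concrete default is just for illustration):

class SubModel(Model):
    values: list[int] = [0, 0, 0]  # ':' declares a pydantic field; '=' alone only overrode the default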
The strangest thing was that it did work in a docker container (Linux) but not locally (Windows). Also, mypy did not pick up on this.
I want to have default values for base class variables while not having default values for derived class variables.
I have read about 'kw_only' at:
https://medium.com/@aniscampos/python-dataclass-inheritance-finally-686eaf60fbb5
And tried it on my code:
from dataclasses import dataclass, field

@dataclass(kw_only=True)
class Base:
    c: int = field(default=8, compare=False)

    def printBVars(self):
        print("Base:", self.c)

@dataclass(kw_only=True)
class Derive(Base):
    cc: int = field()

    def printDVars(self):
        super().printBVars()
        print("Derive:", self.cc)

a = Derive()
a.printDVars()
But Python is giving this error:
  File "<string>", line 8, in <module>
TypeError: dataclass() got an unexpected keyword argument 'kw_only'
What am I doing wrong?
I'm using Python 3.9.
I have read about some suggested solutions on Google and tried a few, such as running the following, but they did not help:
pip install attrs --upgrade
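As an aside on the crux here: dataclass() only gained kw_only in Python 3.10, which is why 3.9 rejects it. On 3.9, a rough sketch of the same idea with the attrs package (which the pip install attrs attempt hints at; attrs has supported kw_only for a while):

import attr

@attr.s(kw_only=True)
class Base:
    c = attr.ib(default=8)

    def printBVars(self):
        print("Base:", self.c)

@attr.s(kw_only=True)
class Derive(Base):
    cc = attr.ib()  # no default; must be passed as a keyword argument

    def printDVars(self):
        super().printBVars()
        print("Derive:", self.cc)

a = Derive(cc=5)
a.printDVars()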
In one application I have code which generates dynamic classes, which reduces the amount of duplicated code considerably. But adding type hints for mypy checking resulted in an error. Consider the following example code (simplified to focus on the relevant bits):
class Mapper:
    @staticmethod
    def action() -> None:
        raise NotImplementedError('Not yet implemented')

def magic(new_name: str) -> type:
    cls = type('%sMapper' % new_name.capitalize(), (Mapper,), {})

    def action() -> None:
        print('Hello')

    cls.action = staticmethod(action)
    return cls

MyCls = magic('My')
MyCls.action()
Checking this with mypy will result in the following error:
dynamic_type.py:15: error: "type" has no attribute "action"
dynamic_type.py:21: error: "type" has no attribute "action"
mypy is obviously unable to tell that the return value of the type call is a subclass of Mapper, so it complains that "type" has no attribute "action" when I assign to it.
Note that the code functions perfectly and does what it is supposed to but mypy still complains.
Is there a way to flag cls as being a type of Mapper? I tried simply appending # type: Mapper to the line which creates the class:
cls = type('%sMapper' % new_name.capitalize(), (Mapper,), {}) # type: Mapper
But then I get the following errors:
dynamic_type.py:10: error: Incompatible types in assignment (expression has type "type", variable has type "Mapper")
dynamic_type.py:15: error: Cannot assign to a method
dynamic_type.py:15: error: Incompatible types in assignment (expression has type "staticmethod", variable has type "Callable[[], None]")
dynamic_type.py:16: error: Incompatible return value type (got "Mapper", expected "type")
dynamic_type.py:21: error: "type" has no attribute "action"
One possible solution is basically to:
Type your magic function with the expected input and output types
Leave the contents of your magic function dynamically typed with judicious use of Any and # type: ignore
For example, something like this would work:
class Mapper:
    @staticmethod
    def action() -> None:
        raise NotImplementedError('Not yet implemented')

def magic(new_name: str) -> Mapper:
    cls = type('%sMapper' % new_name.capitalize(), (Mapper,), {})

    def action() -> None:
        print('Hello')

    cls.action = staticmethod(action)  # type: ignore
    return cls  # type: ignore

MyCls = magic('My')
MyCls.action()
It may seem slightly distasteful to leave a part of your codebase dynamically typed, but in this case, I don't think there's any avoiding it: mypy (and the PEP 484 typing ecosystem) deliberately does not try to handle super-dynamic code like this.
Instead, the best you can do is to cleanly document the "static" interface, add unit tests, and keep the dynamic portions of your code confined to as small a region as possible.
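A variant of the same idea, sketched under the assumption of a reasonably recent mypy, is to cast() the result of type() so callers see Type[Mapper] rather than an instance type; the single remaining ignore marks the method assignment that mypy cannot verify:

from typing import Type, cast

class Mapper:
    @staticmethod
    def action() -> None:
        raise NotImplementedError('Not yet implemented')

def magic(new_name: str) -> Type[Mapper]:
    # cast() records what we know but mypy cannot infer: this is a Mapper subclass
    cls = cast(Type[Mapper], type('%sMapper' % new_name.capitalize(), (Mapper,), {}))

    def action() -> None:
        print('Hello')

    cls.action = staticmethod(action)  # type: ignore
    return cls

MyCls = magic('My')
MyCls.action()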
I am getting a strange error when I run nosetests:
======================================================================
ERROR: Extract test data from tarball.
----------------------------------------------------------------------
TypeError: extract_test_data() missing 1 required positional argument: 'calling_file'
The code in question is split over two files:
tests/core.py
class CoreTestCase(unittest.TestCase):
    @classmethod
    def extract_test_data(cls, calling_file, base='data', name_only=False):
        """Extract test data from tarball.
        ...
        """
        ...
tests/.../test_this.py
class TestThis(core.CoreTestCase):
    """Run some tests."""

    @classmethod
    def setUpClass(cls):
        cls.TESTDAT_DIR = cls.extract_test_data(__file__)
The imports, etc., work correctly, and unittest does not have any trouble. But for some reason, nose is mangling the call.
I've tried all of the following:
cls.TESTDAT_DIR = cls.extract_test_data(calling_file=__file__)
cls.TESTDAT_DIR = cls.extract_test_data(cls, __file__)
cls.TESTDAT_DIR = cls.extract_test_data(cls, calling_file=__file__)
but then I still get an odd assortment of errors:
TypeError: extract_test_data() got multiple values for argument 'calling_file'
AttributeError: type object 'TestThis' has no attribute 'TESTDAT_DIR'
nose is trying to run extract_test_data as if it were a unit test. Rename it to exclude the token test, or add the nottest decorator to extract_test_data:
from nose.tools import nottest

class CoreTestCase(unittest.TestCase):
    @nottest
    @classmethod
    def extract_test_data(cls, calling_file, base='data', name_only=False):
        """Extract test data from tarball.
        ...
        """
        ...
EDIT: link to the docs, where it is explained that, by default, the testMatch regular expression will run any function that has test or Test at a word boundary or following a - or _.
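For the rename route, any name without the test token at a word boundary works; a sketch (the new name is arbitrary):

class CoreTestCase(unittest.TestCase):
    @classmethod
    def extract_fixture_data(cls, calling_file, base='data', name_only=False):
        """Extract fixture data from tarball; nose's testMatch no longer matches the name."""
        ...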
I'm writing a piece of software over on GitHub. It's basically a tray icon with some extra features. I want to provide working code without making the user install what are essentially dependencies for optional features, and I don't want to import things I'm not going to use, so I thought code like this would be a "good solution":
---- IN LOADING FUNCTION ----
features = []

for path in sys.path:
    if os.path.exists(os.path.join(path, 'pynotify')):
        features.append('pynotify')
    if os.path.exists(os.path.join(path, 'gnomekeyring.so')):
        features.append('gnome-keyring')

# user dialog to ask for stuff
# notifications available, do you want them enabled?
dlg = ConfigDialog(features)
if not dlg.get_notifications():
    features.remove('pynotify')

service_start(features ...)

---- SOMEWHERE ELSE ------
def service_start(features, other_config):
    if 'pynotify' in features:
        import pynotify
        # use pynotify...
There are some issues, however. If a user formats his machine, installs the newest version of his OS, and redeploys this application, features suddenly disappear without warning. The solution is to present this in the configuration window:
if 'pynotify' in features:
    # gtk checkbox
else:
    # gtk label reading "Get pynotify and enjoy notification pop ups!"
But if this is, say, a Mac, how do I know I'm not sending the user on a wild goose chase looking for a dependency they can never fill?
The second problem is this check:
if os.path.exists(os.path.join(path, 'gnomekeyring.so')):
Can I be sure that the file is always called gnomekeyring.so across all the Linux distros?
How do other people test for these features? The problem with the basic
try:
    import pynotify
except:
    pynotify = disabled
is that the code is global; these blocks might be littered all over, and even if the user doesn't want pynotify... it's loaded anyway.
So what do people think is the best way to solve this problem?
The try: method does not need to be global — it can be used in any scope and so modules can be "lazy-loaded" at runtime. For example:
def foo():
    try:
        import external_module
    except ImportError:
        external_module = None

    if external_module:
        external_module.some_whizzy_feature()
    else:
        print("You could be using a whizzy feature right now, if you had external_module.")
When your script is run, no attempt will be made to load external_module. The first time foo() is called, external_module is (if available) loaded and inserted into the function's local scope. Subsequent calls to foo() reinsert external_module into its scope without needing to reload the module.
In general, it's best to let Python handle import logic — it's been doing it for a while. :-)
You might want to have a look at the imp module, which basically does what you do manually above: you can first look for a module with find_module() and then load it via load_module() or by simply importing it (after checking the config). Note that imp has since been deprecated in favor of importlib.
And btw, if using except: I would always add the specific exception to it (here ImportError) so as not to accidentally catch unrelated errors.
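For reference, a rough sketch of the same availability check with importlib (find_spec locates a module without importing it; the feature list is from the question above):

import importlib.util

def have_module(name):
    # True if the module can be found on sys.path, without actually importing it
    return importlib.util.find_spec(name) is not None

if have_module('pynotify'):
    features.append('pynotify')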
Not sure if this is good practice, but I created a function that does the optional import (using importlib) and error handling:
def _optional_import(module: str, name: str = None, package: str = None):
    import importlib
    try:
        module = importlib.import_module(module)
        return module if name is None else getattr(module, name)
    except ImportError as e:
        if package is None:
            package = module
        msg = f"install the '{package}' package to make use of this feature"
        raise ValueError(msg) from e
If an optional module is not available, the user will at least get an idea of what to do. E.g.
# code ...
if file.endswith('.json'):
    from json import load
elif file.endswith('.yaml'):
    # equivalent to 'from yaml import safe_load as load'
    load = _optional_import('yaml', 'safe_load', package='pyyaml')
# code using load ...
The main disadvantage of this approach is that your imports have to be done in-line and are not all at the top of your file. Therefore, it might be considered better practice to use a slight adaptation of this function (assuming that you are importing a function or the like):
def _optional_import_(module: str, name: str = None, package: str = None):
    import importlib
    try:
        module = importlib.import_module(module)
        return module if name is None else getattr(module, name)
    except ImportError as e:
        if package is None:
            package = module
        msg = f"install the '{package}' package to make use of this feature"
        import_error = e

        def _failed_import(*args):
            raise ValueError(msg) from import_error

        return _failed_import
Now you can put these imports alongside the rest of your imports, and the error will only be raised when the function that failed to import is actually used. E.g.
from utils import _optional_import_  # let's assume we import the function
from json import load as json_load
yaml_load = _optional_import_('yaml', 'safe_load', package='pyyaml')

# unimportant code ...

with open('test.txt', 'r') as fp:
    result = yaml_load(fp)  # will raise a ValueError if the import was not successful
PS: sorry for the late answer!
Another option is to use @contextmanager and with. In this situation, you do not know beforehand which dependencies are needed:
from contextlib import contextmanager

@contextmanager
def optional_dependencies(error: str = "ignore"):
    assert error in {"raise", "warn", "ignore"}
    try:
        yield None
    except ImportError as e:
        if error == "raise":
            raise e
        if error == "warn":
            msg = f'Missing optional dependency "{e.name}". Use pip or conda to install.'
            print(f'Warning: {msg}')
Usage:
with optional_dependencies("warn"):
import module_which_does_not_exist_1
import module_which_does_not_exist_2
z = 1
print(z)
Output:
Warning: Missing optional dependency "module_which_does_not_exist_1". Use pip or conda to install.
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Cell In [43], line 5
      3     import module_which_does_not_exist_2
      4     z = 1
----> 5 print(z)

NameError: name 'z' is not defined
Here, you should define all your imports immediately inside the with block. The first module which is not installed will throw an ImportError, which is caught by optional_dependencies. Depending on how you want to handle this error, it will either ignore it, print a warning, or raise it again.
The entire block will only run if all the modules are installed.
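For completeness, the "raise" mode with the same context manager simply re-raises the first ImportError:

with optional_dependencies("raise"):
    import module_which_does_not_exist_1  # ImportError propagates immediately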
Here's a production-grade solution, using importlib and Pandas's import_optional_dependency, as suggested by @dre-hh:
from typing import *
import importlib, types

def module_exists(
        *names: Union[List[str], str],
        error: str = "ignore",
        warn_every_time: bool = False,
        __INSTALLED_OPTIONAL_MODULES: Dict[str, bool] = {}
) -> Optional[Union[Tuple[types.ModuleType, ...], types.ModuleType]]:
    """
    Try to import optional dependencies.
    Ref: https://stackoverflow.com/a/73838546/4900327

    Parameters
    ----------
    names: str or list of strings.
        The module name(s) to import.
    error: str {'raise', 'warn', 'ignore'}
        What to do when a dependency is not found.
        * raise : Raise an ImportError.
        * warn: print a warning.
        * ignore: If any module is not installed, return None; otherwise,
          return the module(s).
    warn_every_time: bool
        Whether to warn every time an import is tried. Only applies when error="warn".
        Setting this to True will result in multiple warnings if you try to
        import the same library multiple times.

    Returns
    -------
    maybe_module : Optional[ModuleType, Tuple[ModuleType...]]
        The imported module(s), if all are found.
        None is returned if any module is not found and `error!="raise"`.
    """
    assert error in {"raise", "warn", "ignore"}
    if isinstance(names, (list, tuple, set)):
        names: List[str] = list(names)
    else:
        assert isinstance(names, str)
        names: List[str] = [names]
    modules = []
    for name in names:
        try:
            module = importlib.import_module(name)
            modules.append(module)
            __INSTALLED_OPTIONAL_MODULES[name] = True
        except ImportError:
            modules.append(None)

    def error_msg(missing: Union[str, List[str]]):
        if not isinstance(missing, (list, tuple)):
            missing = [missing]
        missing_str: str = ' '.join([f'"{name}"' for name in missing])
        dep_str = 'dependencies'
        if len(missing) == 1:
            dep_str = 'dependency'
        msg = f'Missing optional {dep_str} {missing_str}. Use pip or conda to install.'
        return msg

    missing_modules: List[str] = [name for name, module in zip(names, modules) if module is None]
    if len(missing_modules) > 0:
        if error == "raise":
            raise ImportError(error_msg(missing_modules))
        if error == "warn":
            for name in missing_modules:
                ## Ensures warning is printed only once
                if warn_every_time is True or name not in __INSTALLED_OPTIONAL_MODULES:
                    print(f'Warning: {error_msg(name)}')
                    __INSTALLED_OPTIONAL_MODULES[name] = False
        return None
    if len(modules) == 1:
        return modules[0]
    return tuple(modules)
Usage: ignore errors (error="ignore", default behavior)
Suppose we want to run certain code only if the required libraries exist:
if module_exists("pydantic", "sklearn"):
from pydantic import BaseModel
from sklearn.metrics import accuracy_score
class AccuracyCalculator(BaseModel):
num_decimals: int = 5
def calculate(self, y_pred: List, y_true: List) -> float:
return round(accuracy_score(y_true, y_pred), self.num_decimals)
print("Defined AccuracyCalculator in global context")
If either dependency pydantic or sklearn does not exist, then the class AccuracyCalculator will not be defined and the print statement will not run.
Usage: raise ImportError (error="raise")
Alternatively, you can raise an error if any module does not exist:
if module_exists("pydantic", "sklearn", error="raise"):
from pydantic import BaseModel
from sklearn.metrics import accuracy_score
class AccuracyCalculator(BaseModel):
num_decimals: int = 5
def calculate(self, y_pred: List, y_true: List) -> float:
return round(accuracy_score(y_true, y_pred), self.num_decimals)
print("Defined AccuracyCalculator in global context")
Output:
line 60, in module_exists(error, __INSTALLED_OPTIONAL_MODULES, *names)
     58 if len(missing_modules) > 0:
     59     if error == "raise":
---> 60         raise ImportError(error_msg(missing_modules))
     61 if error == "warn":
     62     for name in missing_modules:

ImportError: Missing optional dependencies "pydantic" "sklearn". Use pip or conda to install.
Usage: print a warning (error="warn")
Alternatively, you can print a warning if a module does not exist:
if module_exists("pydantic", "sklearn", error="warn"):
from pydantic import BaseModel
from sklearn.metrics import accuracy_score
class AccuracyCalculator(BaseModel):
num_decimals: int = 5
def calculate(self, y_pred: List, y_true: List) -> float:
return round(accuracy_score(y_true, y_pred), self.num_decimals)
print("Defined AccuracyCalculator in global context")
if module_exists("pydantic", "sklearn", error="warn"):
from pydantic import BaseModel
from sklearn.metrics import roc_auc_score
class RocAucCalculator(BaseModel):
num_decimals: int = 5
def calculate(self, y_pred: List, y_true: List) -> float:
return round(roc_auc_score(y_true, y_pred), self.num_decimals)
print("Defined RocAucCalculator in global context")
Output:
Warning: Missing optional dependency "pydantic". Use pip or conda to install.
Warning: Missing optional dependency "sklearn". Use pip or conda to install.
Here, we ensure that only one warning is printed for each missing module; otherwise you would get a warning each time you try to import.
This is very useful for Python libraries where you might try to import the same optional dependencies many times, and only want to see one Warning.
You can pass warn_every_time=True to always print the warning when you try to import.
I'm really excited to share this new technique I came up with to handle optional dependencies!
The concept is to produce the error when the uninstalled package is used, not imported.
Just add a single call before your imports. You don't need to change any code at all. No more using try: when importing. No more using conditional skip decorators when writing tests.
Main components
An importer to return a fake module for missing imports
A fake module that raises an exception when it's used
A custom Exception that will skip tests automatically if raised within one
Minimal Code Example
import sys
import importlib
from unittest.case import SkipTest
from _pytest.outcomes import Skipped

class MissingOptionalDependency(SkipTest, Skipped):
    def __init__(self, msg=None):
        self.msg = msg

    def __repr__(self):
        return f"MissingOptionalDependency: {self.msg}" if self.msg else "MissingOptionalDependency"

class GeneralImporter:
    def __init__(self, *names):
        self.names = names
        sys.meta_path.insert(0, self)

    def find_spec(self, fullname, path=None, target=None):
        if fullname in self.names:
            return importlib.util.spec_from_loader(fullname, self)

    def create_module(self, spec):
        return FakeModule(name=spec.name)

    def exec_module(self, module):
        pass

class FakeModule:
    def __init__(self, name):
        self.name = name

    def __call__(self, *args, **kwargs):
        raise MissingOptionalDependency(f"Optional dependency '{self.name}' was used but it isn't installed.")

GeneralImporter("notinstalled")

import notinstalled  # No error
print(notinstalled)  # <__main__.FakeModule object at 0x0000014B7F6D9E80>

notinstalled()  # MissingOptionalDependency: Optional dependency 'notinstalled' was used but it isn't installed.
Package
The technique above has some shortcomings that my package fixes.
It's open-source, lightweight, and has no dependencies!
Some key differences from the example above:
Covers more than 100 dunder methods (All tested)
Covers 15 common dunder attribute lookups
Entry function is generalimport which returns an ImportCatcher
ImportCatcher holds names, scope, and caught names
It can be enabled and disabled
The scope prevents external packages from being affected
Wildcard support to allow any package to be imported
Puts the importer first in sys.meta_path
Lets it catch namespace imports (Usually occurs with uninstalled packages)
Generalimport on GitHub
pip install generalimport
Minimal example
from generalimport import generalimport

generalimport("notinstalled")

from notinstalled import missing_func  # No error

missing_func()  # Error occurs here
The readme on GitHub goes more in-depth
One way to handle the problem of different dependencies for different features is to implement the optional features as plugins. That way the user has control over which features are activated in the app but isn't responsible for tracking down the dependencies herself. That task then gets handled at the time of each plugin's installation.
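As a rough sketch of that idea (the package layout and all names here are hypothetical): each optional feature lives in its own module under a plugins package, each such module imports its own dependencies, and the app only loads the plugins that actually import cleanly.

import importlib
import pkgutil

def load_plugins(package_name='myapp.plugins'):
    """Import every module found in the plugins package; skip ones whose dependencies are missing."""
    plugins = {}
    package = importlib.import_module(package_name)
    for info in pkgutil.iter_modules(package.__path__):
        try:
            plugins[info.name] = importlib.import_module(f'{package_name}.{info.name}')
        except ImportError:
            pass  # plugin (or one of its dependencies) is not installed
    return plugins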