I'm trying to build a web app with Flask, MySQL, SQLAlchemy and Alembic, but I can't figure out how imports work in Python or how to set up my target_metadata so that I can use alembic revision --autogenerate.
Here is my directory's tree:
My website package's __init__.py looks like this:
import os
from flask import Flask
app = Flask(__name__, static_folder=os.path.join(os.path.dirname(os.path.abspath(__file__)), '../static'))
app.config.from_pyfile('config.py', silent=True)
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base
engine = create_engine(app.config['SQLALCHEMY_DATABASE_URI'], convert_unicode=True)
db_session = scoped_session(sessionmaker(autocommit=False,
                                         autoflush=False,
                                         bind=engine))
Base = declarative_base()
Base.query = db_session.query_property()

@app.teardown_appcontext
def shutdown_session(exception=None):
    db_session.remove()
from website import views
Then in my env.py when I try to import my Base like this:
from website import Base
target_metadata = Base.metadata
and try to run alembic revision --autogenerate ... I get this error:
ImportError: No module named website.
And when I try to import Base with a relative import, like this:
from .website import Base
target_metadata = Base.metadata
I get this error: ValueError: Attempted relative import in non-package.
Can you please help me understand how imports work in Python and how I can set my target_metadata?
I've just recently had this problem myself, though not with Flask. What worked for me is simple, but it seems to be necessary: the current directory isn't on the Python path, so when you do from website import Base, Python throws an exception because it can't find the website module.
Try adding this at the top of your env.py module:
import os
import sys
sys.path.append(os.getcwd())
It's a really hacky way to do it, but it works for me.
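Putting the two snippets together, the top of env.py would then look roughly like this (a minimal sketch, assuming alembic is run from the project root, i.e. the directory that contains the website package):
import os
import sys

# the project root isn't on the path by default, so make `website` importable
sys.path.append(os.getcwd())

from website import Base  # the Base defined in website/__init__.py

target_metadata = Base.metadata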
Also, just curious here... is there any reason you aren't using libraries that do just about all of this for you? Think ones like Flask-SQLAlchemy, Flask-Migrate, or Flask-Alembic (I forget which, but one of them wraps Alembic for you).
If you're unaware of these, you might want to check out the Flask extension registry; there are some really handy ones there.
Related
This question is an extension of my previous one here. It was suggested that I explain the problem in more detail. As the heading says, I am trying to find a way to avoid importing the application factory (the create_app function) into a module that needs the application context and where "import current_app as app" is not sufficient.
My problem is that I have a circular import caused by this create_app function, which I need in order to get the app context.
In my __init__.py, I have this:
# application/__init__.py
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_restful import Api
from application.resources.product import Product, Products
from application.resources.offer import Offer, Offers # HERE IS THE PROBLEM
api = Api()
db = SQLAlchemy()
api.add_resource(Product, "/product/<string:name>") # GET, POST, DELETE, PUT to my local database
api.add_resource(Products, "/products") # GET all products from my local database
api.add_resource(Offer, "/offer/<int:id>") # POST call to the external Offers API microservise
api.add_resource(Offers, "/offers") # GET all offers from my local database
def create_app(config_filename=None):
    """ Initialize core application. """
    app = Flask(__name__, instance_relative_config=False)
    app.config.from_object("config.Config")
    db.init_app(app)
    api.init_app(app)
    with app.app_context():
        db.create_all()
        return app
The problem is in this line:
from application.resources.offer import Offer, Offers # HERE IS THE PROBLEM
because in that module, I have:
#application/resources/offer.py
from flask_restful import Resource
from application.models.offer import OfferModel # IMPORTING OFFER MODEL
which in turn imports application/models/offer.py where I have the critical part:
#application/models/offer.py
import requests
# from flask import current_app as app
from application import create_app # THIS CAUSES THE CIRCULAR IMPORT ERROR
from sqlalchemy.exc import OperationalError
app = create_app() # I NEED TO CREATE THE APP IN ORDER TO GET THE APP CONTEXT BECAUSE IN THE CLASS I HAVE SOME FUNCTIONS THAT NEED IT
class OfferModel(db.Model):
    """ Data model for offers. """

    # some code to instantiate the class... + other methods...

    # THIS IS ONE OF THE METHODS THAT NEEDS APP_CONTEXT OR ELSE IT WILL ERROR OUT
    @classmethod
    def update_offer_price(cls):
        """ Call offers api to get new prices. This function will run in a separate thread in a scheduler. """
        with app.app_context():
            headers = {"Bearer": app.config["MS_API_ACCESS_TOKEN"]}
            for offer_id in OfferModel.offer_ids:
                offers_url = app.config["MS_API_OFFERS_BASE_URL"] + "/products/" + str(offer_id) + "/offers"
                res = requests.get(offers_url, headers=headers).json()
                for offer in res:
                    try:
                        OfferModel.query.filter_by(offer_id=offer["id"]).update(dict(price=offer["price"]))
                        db.session.commit()
                    except OperationalError:
                        print("Database does not exist.")
                        db.session.rollback()
I have tried to use from flask import current_app as app to get the context, but it did not work. I don't know why current_app was not sufficient to get the context; that is what forces me to import the create_app application factory, which causes the circular import problem.
Your update_offer_price method needs database interaction and access to the configuration. It gets both from the application context, but that only works if your Flask application is initialized. Since this method runs in a separate thread, you end up creating a second Flask application instance in that thread.
An alternative is to get standalone database interaction and configuration access outside the application context.
Configuration
Configuration does not seem to be a problem, as your application already gets it from another module:
app.config.from_object("config.Config")
So you can directly import this object to your offer.py:
from config import Config
headers = {"Bearer": Config.MS_API_ACCESS_TOKEN}
Database access
To get standalone database access you need to define your models with plain SQLAlchemy instead of flask_sqlalchemy. This was already described in this answer, but I'll post the essentials here. For your case it may look like this. Your base.py module:
from sqlalchemy import MetaData
from sqlalchemy.ext.declarative import declarative_base
metadata = MetaData()
Base = declarative_base(metadata=metadata)
And offer.py module:
import sqlalchemy as sa
from .base import Base
class OfferModel(Base):
    __tablename__ = "offers"  # table name assumed; plain declarative models need one

    id = sa.Column(sa.Integer, primary_key=True)
    # other column declarations
The produced metadata object is used to initialize your flask_sqlalchemy object:
from flask_sqlalchemy import SQLAlchemy
from application.models.base import metadata
db = SQLAlchemy(metadata=metadata)
Your models can be queried outside the application context, but you need to create the database engine and sessions manually. For example:
from sqlalchemy import create_engine
from sqlalchemy.orm import Session
from config import Config
from application.models.offer import OfferModel

engine = create_engine(Config.YOUR_DATABASE_URL)
# It is recommended to create a single engine
# and use it afterwards to bind database sessions to.
# Perhaps the `application.models.base` module
# is a better place for this declaration.

def your_database_interaction():
    session = Session(engine)
    offers = session.query(OfferModel).all()
    for offer in offers:
        # Some update here
        pass
    session.commit()
    session.close()
Note that with this approach you can't use the query attribute on your model classes for querying, I mean:
OfferModel.query.all() # Does not work
db.session.query(OfferModel).all() # Works
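For illustration, the update_offer_price method from the question could then be rewritten as a plain function along these lines; the sketch reuses Config.YOUR_DATABASE_URL and the MS_API_* config keys from above, takes the offer ids as an argument, and manages the session by hand (a sketch under those assumptions, not a drop-in replacement):
import requests
from sqlalchemy import create_engine
from sqlalchemy.orm import Session
from sqlalchemy.exc import OperationalError
from config import Config
from application.models.offer import OfferModel

engine = create_engine(Config.YOUR_DATABASE_URL)  # same placeholder as above

def update_offer_prices(offer_ids):
    """Refresh prices from the external offers API without any app context."""
    headers = {"Bearer": Config.MS_API_ACCESS_TOKEN}
    session = Session(engine)
    try:
        for offer_id in offer_ids:
            url = Config.MS_API_OFFERS_BASE_URL + "/products/" + str(offer_id) + "/offers"
            for offer in requests.get(url, headers=headers).json():
                session.query(OfferModel).filter_by(offer_id=offer["id"]).update({"price": offer["price"]})
        session.commit()
    except OperationalError:
        print("Database does not exist.")
        session.rollback()
    finally:
        session.close()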
OK, so this is how I solved it. I made a new file, endpoints.py, where I put all my API resources:
# application/endpoints.py
from application import api
from application.resources.product import Product, Products
from application.resources.offer import Offer, Offers
api.add_resource(Product, "/product/<string:name>") # GET, POST, DELETE, PUT - calls to local database
api.add_resource(Products, "/products") # GET all products from local database.
api.add_resource(Offer, "/offer/<int:id>") # POST call to the Offers API microservice.
api.add_resource(Offers, "/offers") # GET all offers from local database
Then in __init__.py I import it at the very bottom.
# application/__init__.py
from flask import Flask
from flask_restful import Api
from db import db
api = Api()
def create_app():
    app = Flask(__name__, instance_relative_config=False)
    app.config.from_object("config.Config")
    db.init_app(app)
    api.init_app(app)
    with app.app_context():
        from application import routes
        db.create_all()
        return app
from application import endpoints # importing here to avoid circular imports
It is not very pretty but it works.
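For completeness, the from db import db line implies a small db.py module that does nothing but create the extension object, so that models, resources and the factory can all import it without touching application/__init__.py. A minimal sketch (the file itself is not shown in the answer):
# db.py
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()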
I'm building a web app using Flask and a PostGIS database I already created. I'm struggling to get Flask-SQLAlchemy to accept the geom column of my existing database. I declare db in an __init__.py file:
from flask import Flask, request, current_app
from config import Config
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
import os, logging
db = SQLAlchemy()
migrate = Migrate()
def create_app(config_class=Config):
    app = Flask(__name__)
    app.config.from_object(config_class)
    db.init_app(app)
    migrate.init_app(app, db)
My code for my models.py file looks like this:
from app import login, db
from datetime import datetime
from geoalchemy2 import Geometry
from time import time
from flask import current_app
class Streets(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    street = db.Column(db.String(50))
    geom = db.GeometryColumn(db.LineString(2))
The error I get is: AttributeError: 'SQLAlchemy' object has no attribute 'GeometryColumn'
And if I try to remove db. from the geom line, I get this error: NameError: name 'GeometryColumn' is not defined
Because Flask-SQLAlchemy wants you to declare a column using db.Column, it seems to override geoalchemy2. Has anyone found a solution to this?
It does not override GeoAlchemy2. You could use GeometryColumn as a column only if you were using the previous version, GeoAlchemy.
The first error is caused by the fact that the SQLAlchemy object from Flask-SQLAlchemy gives you access to functions etc. from sqlalchemy and sqlalchemy.orm. It does not include stuff from GeoAlchemy2 or such.
The second error is due to not having the name GeometryColumn in scope. You do import Geometry from geoalchemy2, but don't use it.
Reading the GeoAlchemy2 ORM tutorial you'd notice that geometry columns are defined as
geom = Column(Geometry('POLYGON'))
or in your case
geom = db.Column(Geometry('LINESTRING')) # dimension defaults to 2
Note that db.Column is sqlalchemy.schema.Column, just in another namespace.
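Putting that together, the model from the question could look roughly like this (a sketch: the srid is an assumption, and db would normally be imported from your app package rather than created here):
from flask_sqlalchemy import SQLAlchemy
from geoalchemy2 import Geometry

db = SQLAlchemy()  # in the real app, import this from your __init__.py

class Streets(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    street = db.Column(db.String(50))
    # GeoAlchemy2 style: a plain Column with a Geometry type
    geom = db.Column(Geometry("LINESTRING", srid=4326))  # srid assumed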
I have a couple of modules: start.py, user.py, projects.py
In start.py I have:
from flask import Flask
from flask.ext.sqlalchemy import SQLAlchemy
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'my_database_URI'
db = SQLAlchemy(app)
db.create_all()
I need to use the db object from both user.py and projects.py. If I import it like so:
from start import db
then I get an error if I do this in both modules. If I only import it into user.py, for example, then it works fine. The error I'm getting is "ImportError: cannot import name db".
Is there a way to solve this?
Sounds like this is a circular import problem.
The way that I've gotten around this is by having another file, a shared.py file, in the root directory. In that file, create the database object:
from flask_sqlalchemy import SQLAlchemy
db = SQLAlchemy()
In your start.py, don't create a new db object. Instead, do
from shared import db
db.init_app(app)
In any place that you want to use the db object, including your models file, import it from shared.py:
from shared import db
# do stuff with db
This way, the object in the shared file will have been initialized with the app context, and there's no chance of circular imports.
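For example, user.py could then define its models against the shared db object without ever importing start.py (the User model and its columns here are invented for illustration):
# user.py
from shared import db

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))

def get_user(user_id):
    # works inside a request / app context once db.init_app(app) has run
    return User.query.get(user_id)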
I'd like to create a Whoosh index from entries in the database connected to my Pyramid application. However, I'm not really sure how to access the database outside of the application.
So my models.py is initialized as follows:
from sqlalchemy import (
Column,
Integer,
Text,
String,
ForeignKey,
Table
)
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import (
scoped_session,
sessionmaker,
relationship,
backref
)
from sqlalchemy.dialects.mysql import DATETIME, FLOAT, TEXT
from zope.sqlalchemy import ZopeTransactionExtension
db_session = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))
dbBase = declarative_base()
dbBase.query = db_session.query_property()
Then in __init__.py, there is an example of loading in the models:
from pyramid.config import Configurator
from sqlalchemy import engine_from_config
from .models import db_session, Recipe
def main(global_config, **settings):
    """ This function returns a Pyramid WSGI application.
    """
    engine = engine_from_config(settings, 'sqlalchemy.')
    db_session.configure(bind=engine)
my production.ini has the engine assignment:
sqlalchemy.url = mysql+pymysql://username:password@localhost:3306/database?charset=utf8
So main is called when the WSGI process is started, which passes the engine from the .ini file. But I'd like to access the database through a script that does not rely on the WSGI process. Can I just assign the engine and bind it to the session in the script? How does the extension=ZopeTransactionExtension() affect the session?
The alchemy scaffold contains an initialization script you can use as an example. The setup looks like the following example, which I've commented for you.
config_uri = argv[1]                                   # Get the config file name from the arguments
setup_logging(config_uri)                              # In case you want to use the logging config from the file
settings = get_appsettings(config_uri)                 # Get a settings dict from the file
engine = engine_from_config(settings, 'sqlalchemy.')   # Set up the engine from the settings
DBSession.configure(bind=engine)                       # Configure the session to use the engine

with transaction.manager:  # Do stuff in a transaction
    ...                    # Do DB stuff
The ZopeTransactionExtension just means DB work needs to be committed, so you either end your code with transaction.commit() or wrap it in a with transaction.manager: block.
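In other words, the tail end of such a script can take either of these two forms (a sketch; DBSession follows the answer's snippet, Recipe comes from the question, and the module path and title column are assumed for illustration):
import transaction
from myproject.models import DBSession, Recipe  # module path assumed

# Option 1: commit explicitly at the end
DBSession.add(Recipe(title="Soup"))
transaction.commit()

# Option 2: let the context manager commit (or abort on an exception)
with transaction.manager:
    DBSession.add(Recipe(title="Salad"))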
There is a section in the Pyramid documentation which deals with writing scripts; however, it's buried in the Command-Line section. The pertinent part is that initializedb.py has been converted into a console script, which creates a script in the bin directory. This is why models is imported with a relative import.
This seemed a bit superfluous for my needs at the moment, so I still needed something simpler. The solution was to include:
if __name__ == '__main__':
    main()
in my script and then call the script from the directory containing my production.ini file with:
../bin/python -m myproject.scripts.whooshindex production.ini
The -m flag runs the module as a script. This fixes the relative imports, so the script gets all the benefits of the predefined initializedb.py script.
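Put together, the whole script (assumed here to live at myproject/scripts/whooshindex.py; the actual Whoosh indexing is left out) follows the same skeleton as initializedb.py:
# myproject/scripts/whooshindex.py
from sys import argv

import transaction
from pyramid.paster import get_appsettings, setup_logging
from sqlalchemy import engine_from_config

from ..models import DBSession, Recipe  # adjust names/paths to your package

def main():
    config_uri = argv[1]  # e.g. production.ini
    setup_logging(config_uri)
    settings = get_appsettings(config_uri)
    engine = engine_from_config(settings, 'sqlalchemy.')
    DBSession.configure(bind=engine)
    with transaction.manager:
        for recipe in DBSession.query(Recipe).all():
            pass  # feed each row to the Whoosh index here

if __name__ == '__main__':
    main()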
I am using the SQLAlchemy expression language, for its notation and connection pooling, to create DAO objects for communicating with the persistence layer. I wanted to get some opinions on how I should approach setting up the metadata and engine so that they are available to the application's view callables. According to SQLAlchemy's documentation http://docs.sqlalchemy.org/en/rel_0_7/core/connections.html, they are typically bound and declared global; however, I've heard that neither this nor the singleton approach is a good idea. Any thoughts would be appreciated...
This is what my __init__.py file looks like inside my project's directory:
from pyramid.config import Configurator
from sqlalchemy import engine_from_config, MetaData, create_engine
from pyramid_beaker import session_factory_from_settings
db_url = 'postgresql://user:password@localhost/dbname'
engine = create_engine(db_url)
meta = MetaData()

def main(global_config, **settings):
    meta.bind = engine
    ...
    # [other configuration settings]
The Pyramid documentation includes a tutorial on integrating Pyramid with SQLAlchemy.
There are two special packages that integrate SQLAlchemy transactions and session management with Pyramid, pyramid_tm and zope.sqlalchemy. These together take care of your sessions:
from sqlalchemy import engine_from_config
from .models import DBSession
def main(global_config, **settings):
    """This function returns a Pyramid WSGI application."""
    engine = engine_from_config(settings, 'sqlalchemy.')
    DBSession.configure(bind=engine)
    # Configuration setup
Here we take the configuration settings from your .ini configuration file. Then, in models.py:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import (
scoped_session,
sessionmaker,
)
from zope.sqlalchemy import ZopeTransactionExtension
DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))
Base = declarative_base()
class YourModel(Base):
    # Define your model
    ...
Note the use of a scoped_session there, using the transaction extension to integrate with Pyramid.
Then in views, all you need to do is use the DBSession session factory to get your sessions:
from pyramid.view import view_config
from .models import (
DBSession,
YourModel,
)
@view_config(...)
def aview(request):
    result = DBSession.query(YourModel).filter(...).first()
Committing and rolling back will be integrated with the request; commit on 2xx and 3xx, rollback on exceptions, for example.
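The wiring in main() then usually amounts to including pyramid_tm in the configurator; zope.sqlalchemy hooks in through the session extension shown in models.py. Roughly (a sketch continuing the main() above):
from pyramid.config import Configurator
from sqlalchemy import engine_from_config
from .models import DBSession

def main(global_config, **settings):
    engine = engine_from_config(settings, 'sqlalchemy.')
    DBSession.configure(bind=engine)
    config = Configurator(settings=settings)
    config.include('pyramid_tm')  # ties the transaction to the request/response cycle
    config.scan()                 # picks up the @view_config views
    return config.make_wsgi_app()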
I think the SQLAlchemy doc examples declare them as globals for succinctness, not to indicate that they recommend it.
I think the only thing you really want to pass around to the different parts of your application is a Session object. The simpler option there is to use a scoped session (which I seem to recall the O'Reilly SQLAlchemy book recommends for simpler web-based applications; your code suggests it's a web app). There are very few cases where you need the engine or metadata anywhere other than where you're instantiating the database connection.
The scoped session would also be created when the engine and metadata are created, upon app startup (in the case of pyramid, in the main function here). Then you'd pass it as a parameter to the various parts of your application that need database access.