I am new to Python.
I am writing some programs that perform an ETL and save the tables to a PostgreSQL database.
The connection to the database is set up as follows.
I have a settings.ini file containing the variable:
[settings]
bdd_link = postgresql://postgres:adrian123@localhost:5432/prueba
In a config.py file I have the variable:
from decouple import AutoConfig

config = AutoConfig()  # AutoConfig must be instantiated before use
link_bdd = config("bdd_link")
And in a sql_con.py file I have the following:
from config import link_bdd
from sqlalchemy import create_engine
from sqlalchemy_utils import database_exists, create_database
def acceder_bdd():
    url_bdd = link_bdd
    if not database_exists(url_bdd):
        create_database(url_bdd)
    engine = create_engine(url_bdd, echo=False)
This all works when I run it normally.
But when I run it inside a virtual environment (venv), the create_tables() function, which in turn uses the acceder_bdd() function to create tables inside PostgreSQL, gives me the following error:
engine.connect().execute(f"DROP TABLE IF EXISTS {table}")
AttributeError: 'NoneType' object has no attribute 'connect'
It's as if the create_engine(url_bdd, echo=False) call isn't working, which makes me think that database settings don't work this way inside a virtual environment.
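For what it's worth, the traceback points at the function rather than at the virtual environment: as posted, acceder_bdd() builds the engine but never returns it, so a caller doing engine = acceder_bdd() gets None, which would produce exactly this AttributeError in either environment. A minimal sketch of the fix, keeping the names from the question:

from config import link_bdd
from sqlalchemy import create_engine
from sqlalchemy_utils import database_exists, create_database

def acceder_bdd():
    url_bdd = link_bdd
    # Create the database on first run
    if not database_exists(url_bdd):
        create_database(url_bdd)
    # Return the engine so callers can do acceder_bdd().connect()
    return create_engine(url_bdd, echo=False)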
I currently have a Flask project set up as follows (I made a few modifications here to try to get the smallest working example, so some of this may differ slightly):
import os

from flask import Flask

from extensions import db

def create_api():
    # Create API app
    api = Flask(__name__)
    # Configure... the configs
    api.config['SQLALCHEMY_DATABASE_URI'] = os.environ.get("DB_URL", "default")
    # Register information to run api
    register_extensions(api)
    register_models(api)
    register_urls(api)
    # Return the api object
    return api

create_api().run()
My register_extensions and register_models functions look as follows:
def register_models(api):
    with api.app_context():
        db.create_all()

def register_extensions(api):
    db.init_app(api)
    bcrypt.init_app(api)
The extensions module:
from flask_sqlalchemy import SQLAlchemy
from flask_bcrypt import Bcrypt
# Establish SQLAlchemy Database extension
db = SQLAlchemy()
# Establish Bcrypt extension for hashing passwords
bcrypt = Bcrypt()
My project directory looks something like
-FSBS
-API
- __init__.py
- app.py
- extensions.py
- routes.py
Everything works great like this. However, if I change the import in app.py from
from extensions import db to from FSBS.API.extensions import db, all my attempts to use db start throwing "KeyError: <weakref at 0x00000251C4B1DAD0; to 'Flask' at 0x00000251C0ED3FA0>".
This is somewhat problematic because I would like to start refactoring my routes into a subfolder, where I have to use from FSBS.API.extensions import db.
Not only that, but I don't understand why this would make a difference, so any advice on solving this little puzzle would be greatly appreciated.
I've recently encountered the exact same problem while trying to put my routes into a submodule.
I can't explain why this happens, but from what I can tell, it has something to do with the Flask-SQLAlchemy version; I initially tried running version 3.0.2.
What solved the problem for me was a downgrade to version 2.5.1.
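A plausible explanation (my assumption, not something either post confirms): running app.py directly while also importing through the package path gives Python two distinct module objects, extensions and FSBS.API.extensions, and therefore two separate SQLAlchemy() instances. init_app() is called on one instance while the routes use the other, which Flask-SQLAlchemy 3.x surfaces as that weakref KeyError when it cannot find the app in its internal registry. If so, the fix is to use one canonical import path everywhere:

# in app.py, routes.py, and everywhere else, always the same path:
from FSBS.API.extensions import db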
This code runs perfectly fine if I first change into the folder where this main.py file is located. So in cmd I just type: python main.py. My database "Datalog.db" is also located in this folder.
If I run this Python file from somewhere else, I get a problem with this line of code: cur.execute(sql). So in cmd I type: python C:\Users\ [...] \main.py, and I get the following error: "sqlite3.OperationalError: no such table: Datalog". Later I want to call this Python file from a python-shell node in Node-RED, and there I have to give the full path of main.py.
I also tried to build an exe file from it, but the same error occurs: "sqlite3.OperationalError: no such table: Datalog".
Apparently the connection to the database is not the issue; rather, my cur.execute command is not working.
I found out that I have to "include my SQLite database file in the include_files statement", but I have no idea how to do this.
Can anybody help? I am very sorry for any inconvenience; I just started programming and this is my first post.
import sqlite3 as db

db_name = 'Datalog'
output_number = 'Output1'
output = 'hello'
timestamp = '2019-11-11 09:27:02'

db_name = f'{db_name}.db'
con = db.connect(db_name)
with con:
    cur = con.cursor()
    sql = f"UPDATE Datalog SET {output_number}='{output}' WHERE timestamp ='{timestamp}'"
    cur.execute(sql)
    con.commit()
print("### DB updated ###")
You can change db_name to the full path of your SQLite database or, even better, dynamically build the path to the db (the code below assumes the db is in the same folder (directory) as the file which calls it):

import os.path

db_name = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'Datalog.db')
Python looks for modules in "sys.path", so you want to insert the directory containing your file into that path so Python can find it.
Here is an example:

import sys
sys.path.insert(0, r'C:\Users\Philip\Work\Bin')

where 0 is the position in sys.path (0 means top, 1 means second from top, etc.).
Obviously you would replace my directory with yours.
Introduction
I'm developing a Python web app running on Flask. One of the modules I developed uses sqlite3 to access a database file in one of my project directories. Locally it works like a charm, but I'm having trouble making it run properly on PythonAnywhere.
Code
Here's the relevant part of my module_database.py (both SQL queries are plain SELECTs):
import sqlite3
import os

PATH_DB = os.path.join(os.path.dirname(__file__), 'res/database.db')
db = sqlite3.connect(PATH_DB)
cursor = db.cursor()

def init():
    cursor.execute(my_sql_query)
    val = cursor.fetchone()

def process():
    cursor.execute(another_sql_query)
    another_val = cursor.fetchone()
I don't know if it's important, but my module is imported like this:
from importlib import import_module
module = import_module(absolute_path_to_module)
module.init() # module init
And afterwards my webapp will regularly call:
module.process()
So I have one access to the db in my init() and one in my process(). Both work when I run the app locally.
Problem
I pulled my code via GitHub onto PythonAnywhere and restarted the app, and I can see in the log file that the access to the DB in init() worked (I print a value, and it's fine).
But then, when my app calls the process() method, I get:
2017-11-06 16:27:55,551: File "/home/account-name/project-name/project_modules/module_database.py", line 71, in my_method
2017-11-06 16:27:55,551: cursor.execute(sql)
2017-11-06 16:27:55,552: sqlite3.DatabaseError: database disk image is malformed
I tried via the console to run an integrity check:
PRAGMA integrity_check;
and it prints OK.
I'd be glad to hear if you have any idea where this could come from.
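For reference, the same check can also be run from Python against the exact file the app opens, which rules out accidentally checking a different copy of the database. A small sketch using PATH_DB from the question:

import sqlite3

db = sqlite3.connect(PATH_DB)
# integrity_check returns the single row ('ok',) when the file is healthy
print(db.execute("PRAGMA integrity_check;").fetchone())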
A small thing, and it may not fix your specific problem, but you should always call os.path.abspath on __file__ before calling os.path.dirname; otherwise you can get unpredictable results depending on how your code is imported/loaded/run:
PATH_DB = os.path.join(
    os.path.dirname(os.path.abspath(__file__)),
    'res/database.db'
)
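Beyond the path issue, sharing one module-level connection and cursor across requests is fragile with sqlite3, particularly on a hosting platform where the web app may be served by more than one thread or process. A sketch of the same function using a short-lived connection per call (same PATH_DB and placeholder query as in the question):

import sqlite3

def process():
    # open a fresh connection per call instead of reusing a module-level one
    db = sqlite3.connect(PATH_DB)
    try:
        cursor = db.cursor()
        cursor.execute(another_sql_query)
        return cursor.fetchone()
    finally:
        db.close()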
I am using Mongoengine (version 0.9.0) with Django (version 1.8).
This is my settings.py:
from mongoengine import connect

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.dummy'
    }
}

MONGO_DBNAME = "mydatabasename"
MONGO_HOSTNAME = "localhost"

connect(MONGO_DBNAME, host=MONGO_HOSTNAME)
I want to have fixtures for the application, so I have created initial_data.json in myapp/fixtures/.
When I run the command python manage.py dumpdata, I get the following error:
CommandError: Unable to serialize database: settings.DATABASES is improperly configured. Please supply the ENGINE value. Check settings documentation for more details.
Questions:
1) Is there any workaround for this problem?
2) Is there any other way to load the initial data?
References at this link
Thank you
Mongoengine is not a backend (in Django terminology). It has its own models (schemas) and its own document-object mapper (like an ORM, but for document databases), but it does not have a Django backend adapter.
You can use it, but there are issues when working with out-of-the-box Django features like tests, fixtures, etc.
You need to write your own loader, sadly but true.
I see 2 options here:
You can try to use Django MongoDB Engine
You can write your own loader for mongodb
I wrote my own fixture loader for tests.
I have a JSON file that maps each collection to the fixture file I need to load into the db.
So here is a quick example:
import bson
import os
from json import loads

from django.conf import settings
from mongoengine.connection import get_db

def _get_db(self):
    self.db = get_db()

def _load_fixtures(self, clear_before_load=True):
    """
    Load fixtures into the db from fixtures/{{DB_NAME}}/{{COLLECTION_NAME}} before each test.
    The file fixtures.json maps each collection name to its fixture file.
    """
    fixture_path = lambda file_name: os.path.join(settings.FIXTURES_DIR, self.db.name, file_name)
    with open(settings.COLLECTION_FIXTURES_PATH) as file_object:
        db_collections = loads(file_object.read())
    for collection_name, filename in db_collections.items():
        collection = self.db[collection_name]
        if clear_before_load:
            collection.remove()
        path = fixture_path(filename)
        if os.path.exists(path) and os.path.isfile(path):
            # BSON data must be read in binary mode
            with open(path, 'rb') as raw_data:
                collection_data = bson.decode_all(raw_data.read())
            for document in collection_data:
                collection.save(document)
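These look like methods of a test case class; hypothetically (the class name is mine, and it assumes the two functions above are defined on the class), wiring them up might look like:

import unittest

class MongoFixtureTestCase(unittest.TestCase):
    def setUp(self):
        # grab the mongoengine db handle, then load the mapped fixtures
        self._get_db()
        self._load_fixtures()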
There is no support for fixtures in mongoengine, and I don't think the mongoengine team is continuing the plugin as of version 0.9.0.
What I ended up doing to load initial data for mongoDB is to create a script called startup.py in my project folder.
startup.py:
from {{app}}.models import Sample
def init():
    if Sample.objects(name="test").count() == 0:  # a flag to prevent initial data repetition
        Sample(name="test").save()
Next is to run this script on Django's startup. The entry point of a Django project is where DJANGO_SETTINGS_MODULE is first loaded, in wsgi.py:
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "{{project_name}}.settings")

import {{project_name}}.startup as startup
startup.init()

application = get_wsgi_application()
With this setup, when you run python manage.py runserver, the init() in startup.py will run and the data you set will be inserted into the DB.
Hope this helps.
I have a question about Celery.
I am calling a task function and I want it to return a list of objects of a specific class.
But if I do this I get an error on my server:
No module named 'modelsgert'
modelsgert is the name of the Python file where my class is defined.
I have the very same file in the project on my server, yet it doesn't recognize it. Probably a reference to the file's location on the Celery server is being sent.
Code on the Celery server:
from celery import Celery
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from modelsgert import (
    Diagnose,
    Procedur,
    DBSession,
    Data
)
import time

celery = Celery('tasks', backend='amqp', broker='amqp://guest@localhost//')

@celery.task()
def test_task(data):
    diagnose = DBSession.query(Diagnose)
    listofdiagnoses = []
    listofdiagnoses.append(diagnose[0])
    listofdiagnoses.append(diagnose[1])
    return listofdiagnoses
Code on the Pyramid server:
celery = Celery(backend='amqp', broker='amqp://guest@192.168.1.5:5672//')
celery.conf.update(CELERY_RESULT_BACKEND='amqp', BROKER_HOST='192.168.1.5', BROKER_USER='kristof', BROKER_PASSWORD='bob', BROKER_VHOST='myvhost', BROKER_PORT=5672)

task = celery.send_task('tasks.test_task', ["kakker"])
TheData = task.get()
Is there a way to fix this problem properly?
Are you certain that modelsgert is available when you see that error?
Celery uses pickle by default, and pickle indeed stores the name of the module and class (together with the data contained in the instance); when loading the data again, the module and class are looked up dynamically. That stage fails here because modelsgert cannot be imported.
I must note that you are trying to send SQLAlchemy objects, and that is very rarely a good idea. The objects are tied to a specific session, and when you unpickle them that session will no longer be there. Moreover, the objects represent database state, and that state could easily have changed by the time you load the objects again.
You should instead send object identifiers and query for the objects again on the other side. Instead of a list of Diagnose objects, send the primary keys:
listofdiagnoses = [d.id for d in diagnose]
On the other side, you'd then use those identifiers to load your Diagnose objects again from the database.
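A minimal sketch of both sides under that approach, assuming Diagnose has an integer primary key column named id and that the Pyramid side has its own session and models (names reused from the question):

# Celery side: return plain primary keys, which serialize safely
@celery.task()
def test_task(data):
    diagnoses = DBSession.query(Diagnose).limit(2).all()
    return [d.id for d in diagnoses]

# Pyramid side: re-query the objects from its own database session
ids = celery.send_task('tasks.test_task', ["kakker"]).get()
diagnoses = DBSession.query(Diagnose).filter(Diagnose.id.in_(ids)).all()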