I have functions that perform some CRUD operations on a database, like this:
def get_user_by_id(user_id):
    db = get_db()
    cursor = db.cursor(cursor_factory=extras.DictCursor)
    cursor.execute(
        'SELECT * FROM users WHERE id = %s',
        (user_id,)
    )
    user = cursor.fetchone()
    cursor.close()
    return user
I want to test it with pytest, but I don't know the right way to do it.
For example, to get user data from the database I should first create the user and then call the function under test. But should I call my create_user() function during setup and then call get_user_by_id()? Or would it be better to populate the database with some test data and then use it? Should I test this kind of function at all?
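A common pattern is a pytest fixture that builds a fresh database per test and seeds it with known rows, so tests are isolated and repeatable. Below is a minimal sketch using an in-memory SQLite database as a stand-in for the real backend; the table layout and the seeded row are illustrative assumptions, not taken from your schema:

```python
import sqlite3

import pytest

@pytest.fixture
def db():
    # A fresh in-memory database for every test, seeded with one known row
    conn = sqlite3.connect(":memory:")
    conn.row_factory = sqlite3.Row  # dict-like rows, similar to DictCursor
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (id, name) VALUES (1, 'alice')")
    conn.commit()
    yield conn
    conn.close()

def get_user_by_id(db, user_id):
    # Same shape as the function under test, parameterized on the connection
    cursor = db.cursor()
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    user = cursor.fetchone()
    cursor.close()
    return user

def test_get_user_by_id(db):
    user = get_user_by_id(db, 1)
    assert user["name"] == "alice"

def test_get_user_by_id_missing(db):
    assert get_user_by_id(db, 42) is None
```

Seeding inside the fixture (rather than calling your own create_user() during setup) keeps the test of get_user_by_id() independent of bugs in create_user(); testing thin query wrappers like this is still worthwhile because it pins down the row shape the rest of the code relies on.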
I have a queryset qs = MyModel.objects.filter(fieldname='fieldname') and a custom plpgsql function my_function() that I can call in a Postgres shell like:
SELECT my_function('SELECT * FROM my_model');
How to do this with Django ORM? I tried to:
with connection.cursor() as cursor:
    query, params = qs.query.sql_with_params()
    query = f"SELECT my_function('{query}')"
    cursor.execute(query, params)
But qs.query.sql_with_params() is not intended to return valid SQL.
Is something like this even possible with Django ORM?
First of all, if you want to call a function or procedure, do it like this:
from django.db import connections

connection = connections['default']
with connection.cursor() as cursor:
    cursor.callproc('name_of_your_function', ('SELECT * FROM some_table WHERE whatever_you_want', ))
    result = cursor.fetchone()
The next thing is getting the real SQL query out of the Django ORM. queryset.query and qs.query.sql_with_params() only give an approximation (especially when you work with dates); the following produces the query that is actually passed to psycopg2:
from django.db import connections
from django.db.models.sql.compiler import SQLCompiler

from .models import SomeModel

qs = SomeModel.objects.filter(create_at__gt='2021-04-20')
conn = connections['default']
# pass the connection alias (qs.db); qs.using is a method, not an alias
compiler = SQLCompiler(qs.query, conn, using=qs.db)
sql, params = compiler.as_sql()
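Putting the two pieces together: compile the queryset to the SQL that will actually be sent, interpolate the parameters, and pass the resulting text to the function. This is a sketch, not a tested recipe: it assumes psycopg2 (whose cursor.mogrify() performs the interpolation, reachable through Django's cursor wrapper) and the my_function() from the question; the helper name is made up:

```python
from django.db import connections

def call_function_on_queryset(qs, func_name='my_function'):
    # Hypothetical helper: run a plpgsql function over the SQL a queryset
    # compiles to. Assumes the psycopg2 backend.
    using = qs.db or 'default'
    conn = connections[using]
    # get_compiler() yields the SQL string and parameters that would
    # actually reach the driver for this queryset
    sql, params = qs.query.get_compiler(using=using).as_sql()
    with conn.cursor() as cursor:
        # mogrify() (psycopg2) binds the parameters, producing plain SQL
        # text that can be passed to the function as a single argument
        literal_sql = cursor.mogrify(sql, params).decode()
        cursor.callproc(func_name, (literal_sql,))
        return cursor.fetchone()
```

The interpolation step matters: callproc() sends the query text as an opaque string literal, so any %s placeholders left inside it would never be bound.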
I have created two tables, Table1 and Table2, and I want to insert data into both tables using the Django ORM. How can I achieve this?
models.py
class Table1(models.Model):
    name = models.CharField(max_length=20, null=True)

class Table2(models.Model):
    name = models.CharField(max_length=20, null=True)
views.py
class Test(ListAPIView):
    def get(self, request):
        obj1 = Table1(name="jasir")
        obj2 = Table2(name="shibin")
        obj1.save()
        obj2.save()
        return Response(True)
I'm saving like this, but I want to save both objects in a single operation. Is there any possibility?
The equivalent SQL query I found is:
BEGIN TRANSACTION
INSERT INTO Table1 (name) VALUES ('jasir')
INSERT INTO Table2 (name) VALUES ('shibin')
COMMIT TRANSACTION
How can I do the same with the Django ORM?
Try making the saves atomic like this:
from django.db import transaction

with transaction.atomic():
    obj1.save()
    obj2.save()
You can use Django's transaction.atomic context manager to do that.
Refer to:
https://docs.djangoproject.com/en/2.2/topics/db/transactions/#django.db.transaction.atomic
from django.db import transaction

with transaction.atomic():
    # This code executes inside a transaction.
    obj1 = Table1(name="jasir")
    obj2 = Table2(name="shibin")
    obj1.save()
    obj2.save()
I'm currently going over a course in web design. What I want to do is check whether a table named portafolio exists in my database, and if not, create it.
I'm using Python (flask) and sqlite3 to manage my database.
So far I have the logic in SQL to create the table if it doesn't exist:
# db is my database variable
db.execute('''CREATE TABLE IF NOT EXISTS portafolio(
                  id INTEGER PRIMARY KEY AUTOINCREMENT,
                  stock TEXT,
                  shares INTEGER,
                  price FLOAT(2),
                  date TEXT
              )''')
But instead of using SQL commands, I'd like to know how I would do the same check in Python, since it would look a lot cleaner.
Any help would be appreciated.
Not sure which way is cleaner, but you can issue a simple SELECT and handle the exception:
try:
    cursor.execute("SELECT 1 FROM portafolio LIMIT 1;")
    exists = True
except sqlite3.OperationalError as e:
    message = e.args[0]
    if message.startswith("no such table"):
        print("Table 'portafolio' does not exist")
        exists = False
    else:
        raise
Note that here we have to check which kind of OperationalError it is, and we have to do this "not pretty" substring check on the message, because there is currently no way to get the actual error code.
Or, a more SQLite specific approach:
table_name = "portafolio"
cursor.execute("""
    SELECT name
    FROM sqlite_master
    WHERE type='table' AND name=?;
""", (table_name,))
exists = bool(cursor.fetchone())
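The sqlite_master lookup can be wrapped into a small reusable helper; here is a quick sketch (the function name is my own):

```python
import sqlite3

def table_exists(conn, table_name):
    # sqlite_master holds one row per table, index, trigger, and view
    cur = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name=?",
        (table_name,),
    )
    return cur.fetchone() is not None

conn = sqlite3.connect(":memory:")
print(table_exists(conn, "portafolio"))   # False before the table is created
conn.execute("CREATE TABLE portafolio (id INTEGER PRIMARY KEY)")
print(table_exists(conn, "portafolio"))   # True afterwards
```

Because the name is bound as a parameter, this also works safely when the table name comes from user input.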
If you are looking to do a neat job with database operations, I advise learning about ORMs (Object-Relational Mapping).
I use Flask with SQLAlchemy. You can use Python classes to manage SQL operations like this:
class Portafolio(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    stock = db.Column(db.String(255), unique=True)
    shares = db.Column(db.Integer, unique=True)
It does all the database checks and migrations easily.
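For the "create if missing" part specifically, SQLAlchemy's metadata.create_all() already checks which tables exist and issues CREATE TABLE only for the missing ones. Here is a sketch using plain SQLAlchemy (the Flask extension's db.create_all() behaves the same way); the column types are guesses based on the question's DDL:

```python
from sqlalchemy import create_engine, Column, Integer, String, Float
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Portafolio(Base):
    __tablename__ = "portafolio"
    id = Column(Integer, primary_key=True)
    stock = Column(String(255))
    shares = Column(Integer)
    price = Column(Float)
    date = Column(String(32))

engine = create_engine("sqlite://")   # in-memory SQLite database
Base.metadata.create_all(engine)      # creates only tables that don't exist yet
Base.metadata.create_all(engine)      # calling it again is a no-op
```

This removes the need for any explicit existence check in application code.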
I have code that works for getting models and fields from a Django database. However, it only works on the default database.
This function wants a database name, and I'd like to get the tables and fields for that database.
def browse_datasource(request, dbname):
    table_info = []
    # This is what I'd /like/ to be able to do, but it doesn't work:
    # tables = connections[dbname].introspection.table_names()
    tables = connection.introspection.table_names()
    found_models = connection.introspection.installed_models(tables)
    for model in found_models:
        tablemeta = model._meta
        columns = [field.column for field in model._meta.fields]
        table_info.append([model.__name__, columns])
How can I perform introspection on the non-default databases? Is there a correct way to get connection.introspection for a database with the name "example", for example?
I found the solution. The trick is getting the database connection from the connections handler, then getting a cursor and passing it to introspection.table_names(), like so:
table_info = []
conn = connections[dbname]
cursor = conn.cursor()
tables = conn.introspection.table_names(cursor)
found_models = conn.introspection.installed_models(tables)
for model in found_models:
    tablemeta = model._meta
I am creating a Django cursor in views.py:
def getDetails(request, name):
    sql = 'select * from myTable where name=%s' % name
    cursor = connections['default'].cursor()
    cursor.execute(sql)
I don't explicitly close the cursor or use a context manager (the with keyword).
I want to know whether cursors are destroyed or persist after each request-response cycle in Django. I am using Django 1.4.
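Whatever the lifetime turns out to be, it is safer not to rely on it. In more recent Django versions the cursor can be used as a context manager, which closes it deterministically, and passing the name as a query parameter avoids the SQL injection risk of % string formatting. A sketch of the same view written that way (table and column names taken from the question; the with-statement support on cursors is not available in 1.4 itself):

```python
from django.db import connections

def getDetails(request, name):
    # Let the driver bind `name` instead of formatting it into the SQL
    # string, which would allow SQL injection
    with connections['default'].cursor() as cursor:
        cursor.execute('SELECT * FROM myTable WHERE name = %s', [name])
        rows = cursor.fetchall()
    # The cursor is closed as soon as the `with` block exits,
    # even if execute() raises
    return rows
```

On Django 1.4 the equivalent is a try/finally that calls cursor.close() explicitly.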