Using web2py for a user frontend CRUD - Python

I was asked to port an Access database to MySQL and provide a simple web frontend for its users. The DB consists of 8-10 tables and stores data about client consultations (client, consultant, topic, hours, ...). I need to provide a web interface our consultants can use to enter all of this information into a predefined mask/form during a session.

My initial thought was to port the Access DB to MySQL, which I have done, and then use the web2py framework to build a user interface with login, data entry, browsing/scrolling through the cases, and pulling reports. web2py is now running with user management, a few sample views and controllers, and the MySQL DB. I added the DB to the DAL in web2py, but then I noticed that web2py requires every table to be defined again in web2py before it can communicate with the SQL server. While struggling to successfully run the extract_mysql_models.py script to export the structure of the already existing SQL DB for use in web2py, my concerns about web2py kept accumulating.

This double/redundant way of talking to my DB strikes me as odd, and web2py does not support Python 3. Is web2py the correct tool for my task, or is there a better way? Thank you very much for listening/helping out.

"This double/redundant way of talking to my DB strikes me as odd, and web2py does not support Python 3."
Any abstraction you want to use to communicate with your database (whether it be the web2py DAL, the Django ORM, SQLAlchemy, etc.) will have to have some knowledge of the database schema in order to construct queries.
Even if you programmatically generated all the SQL statements yourself without use of an ORM/DAL, your code would still have to have some knowledge of the database structure (i.e., somewhere you have to specify names of tables and fields, etc.).
For existing databases, we aim to automate this process via introspection of the database schema, which is the purpose of the extract_mysql_models.py script. If that script isn't working, you should report an issue on GitHub and/or open a thread in the web2py Google Group.
Also, note that when creating a new database, web2py helps you avoid redundant specification of the schema by handling migrations (including table creation) for you -- so you specify the schema only in web2py, and the DAL will automatically create the tables in the database (of course, this is optional).
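For illustration, here is a minimal sketch of declaring one existing MySQL table in a web2py model file; the connection string, table, and field names are hypothetical placeholders for your actual schema. The key detail is migrate=False, which tells the DAL the table already exists so it must not try to create or alter it:

# db.py - a minimal sketch; connection string, table and field names are
# hypothetical. Inside a web2py model file, DAL and Field are already in
# scope; standalone, they can be imported from the pydal package.
db = DAL('mysql://user:password@localhost/consulting')

db.define_table('consultation',
    Field('client', 'string'),
    Field('consultant', 'string'),
    Field('topic', 'string'),
    Field('hours', 'double'),
    migrate=False)  # table exists already: do not create or alter it

From there, SQLFORM(db.consultation) produces the predefined insert form and SQLFORM.grid(db.consultation) gives browsing and searching, which covers the form and browse/scroll requirements from the question. (Note that the DAL works best when each table has an auto-incrementing id primary key.)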

Related

Writing a Django backend program that runs indefinitely -- what to keep in mind?

I am trying to write a Django app that queries a remote database for some data, performs some calculations on a portion of this data and stores the results (in the local database using Django models). It also filters another portion and stores the result separately. My front end then queries my Django database for these processed data and displays them to the user.
My questions are:
How do I write an agent program that runs continuously in the background, downloads data from the remote database, performs the calculations/filtering, and stores the results in the local Django database? In particular, what are the most important things to keep in mind when writing a program that runs indefinitely?
Is using cron for this purpose a good idea?
The data retrieved from the remote database belong to multiple users, and each user's data must be kept/stored separately in my local database as well. How do I achieve that? Using row-level/class-instance-level permissions, maybe? Remember that the backend agent does the storing, updating, and deleting; the front end only reads data (through HTTP requests).
Finally, I allow creation of new users. If a new user has valid credentials for the remote database, the user should be allowed to use my app. In that case, my backend will download this particular user's data from the remote database, perform the calculations/filtering, and present the results to the user. How can I handle the dynamic creation of objects/database tables for the new users, and how can I differentiate between users' data when retrieving it?
Would very much appreciate answers from experienced programmers with knowledge of Django. Thank you.
For your numbered questions:
1) The standard go-to solution for timed and background tasks is Celery, which has Django integration. There are others, like Huey: https://github.com/coleifer/huey
2) The usual solution is that each row contains a user_id column identifying which user the data belongs to. This maps to the User model via the Django ORM's ForeignKey field. Do your users need to query the database directly, or do they have direct database accounts? If not, this solution should be enough. It sounds like your front end has one database connection and all permission logic is handled by the front end, not by the database itself. A sketch of both points follows below.
3) See 2)
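A minimal sketch of points 1) and 2), assuming hypothetical names (ProcessedRecord, fetch_remote_rows, compute); a Celery task stands in for an agent that runs forever, and the ForeignKey tags each row with its owner:

# models.py - a minimal sketch; model and field names are hypothetical.
from django.contrib.auth.models import User
from django.db import models

class ProcessedRecord(models.Model):
    # Tag every row with its owner so the front end can filter per user
    # instead of needing separate tables or databases per user.
    owner = models.ForeignKey(User, on_delete=models.CASCADE)
    value = models.FloatField()
    created = models.DateTimeField(auto_now_add=True)

# tasks.py - a Celery task scheduled periodically (e.g. via Celery beat)
# in place of an indefinitely running agent; fetch_remote_rows() and
# compute() are hypothetical stand-ins for your remote-database query
# and your calculations.
from celery import shared_task

@shared_task
def refresh_user_data(user_id):
    for row in fetch_remote_rows(user_id):
        ProcessedRecord.objects.create(owner_id=user_id, value=compute(row))

The front end then reads only its own rows with ProcessedRecord.objects.filter(owner=request.user), which also answers the retrieval half of questions 2 and 3.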

Django Postgres methods for an external database

I am working on a Django app for investors, currently using one database and 3 installed apps configured in settings.py.
I am about to integrate a new feature in which every broker registers their IP in our app, and we manually replicate our Postgres database onto their server (everything is the same except the 'HOST' setting of the database). The broker then sends GET and POST requests to our server from their server.
I need to switch the database based on the incoming request. I think I can connect to their Postgres database dynamically by inspecting the request and then process the SQL queries. My requirement is that I just need to use Django's Postgres methods for processing without configuring the database in the settings file.
If configuring databases in settings is the only way, how can I switch databases efficiently on every request, and how many databases can be connected in a single Django app?
I believe that if you want to use Django methods (rather than writing raw SQL queries and parsing the results yourself), you will have to use the settings.py method and define all your databases there.
https://docs.djangoproject.com/en/1.7/topics/db/multi-db/
In short, you define each database there and can manually choose one in your code via (as per the docs):
Author.objects.using(database_name_variable).filter(...)
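As a fuller sketch of that pattern: the alias names, the IP-to-alias mapping, and the Author model below are assumptions for illustration only. The engine name matches the Django 1.7 docs linked above.

# settings.py - one entry per replicated broker database; per the
# question, everything is identical except HOST.
DATABASES = {
    'default':  {'ENGINE': 'django.db.backends.postgresql_psycopg2',
                 'NAME': 'investors', 'USER': 'app',
                 'PASSWORD': 'secret', 'HOST': 'localhost'},
    'broker_a': {'ENGINE': 'django.db.backends.postgresql_psycopg2',
                 'NAME': 'investors', 'USER': 'app',
                 'PASSWORD': 'secret', 'HOST': '203.0.113.10'},
}

# views.py - choose the alias from the incoming request; the mapping of
# registered broker IPs to aliases is hypothetical.
from django.shortcuts import render
from myapp.models import Author

BROKER_DB_BY_IP = {'203.0.113.10': 'broker_a'}

def author_list(request):
    # Fall back to the local database when the caller's IP is unknown.
    alias = BROKER_DB_BY_IP.get(request.META.get('REMOTE_ADDR'), 'default')
    authors = Author.objects.using(alias).all()
    return render(request, 'authors.html', {'authors': authors})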
An alternative would be to look at using REST api (like Tastypie) to make calls to different Django instances connected to each database.

Storing session data in database in Pyramid using beaker

I'm building a web application in Pyramid and it needs user logins. The database backend is a MySQL DB connected via SQLAlchemy.
Pyramid has an introduction to using Beaker for sessions, but it only shows how to configure it using files. I couldn't find out how to store session data in the database (I think it should be possible), since then I would have only one place where my varying data is stored.
I found it. Put something like this in your configuration file (development.ini/production.ini):
session.type=ext:database
session.secret=someThingReallyReallySecret
session.cookie_expires=true
session.key=WhatEver
session.url=mysql://user:password@host/database
session.timeout=3000
session.lock_dir=%(here)s/var/lock
I don't know whether it is possible (or sensible) to put the locking in the DB too, but with this configuration the sessions will live in the DB. You'll need to take care of deleting old sessions from the DB yourself (but I think that's also the case when using files).
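As a minimal sketch of that cleanup, run periodically (e.g. from cron); it assumes Beaker's default table name beaker_cache and its accessed timestamp column, so verify against your actual schema before relying on it:

# cleanup_sessions.py - a sketch; table and column names assume Beaker's
# defaults (a beaker_cache table with an 'accessed' timestamp column).
from datetime import datetime, timedelta
from sqlalchemy import create_engine, text

engine = create_engine('mysql://user:password@host/database')
cutoff = datetime.utcnow() - timedelta(days=30)  # keep 30 days of sessions

with engine.begin() as conn:
    conn.execute(text('DELETE FROM beaker_cache WHERE accessed < :cutoff'),
                 {'cutoff': cutoff})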

BlueBream framework and multi connections to databases

I am taking my first steps with the BlueBream framework. In my project I must get data from relational databases: MySQL, PostgreSQL and MS SQL Server. So far I have only worked through a simple hello-world tutorial :) I know how to write interfaces and implementations, etc.
My question is: how do I set up a connection to an RDBMS, and how do I set up multiple connections? Could you give me a simple step-by-step tutorial?
Also, how do I bootstrap ("bake") the database schema? Does BlueBream have something like Django's syncdb command?
Take a look at zope.sqlalchemy; it integrates SQLAlchemy into the Zope transaction manager. SQLAlchemy in turn lets you access many different databases, including MySQL, PostgreSQL and MS SQL Server.
The zope.sqlalchemy package explains how to obtain a SQLAlchemy session. From there on out you use bog-standard SQLAlchemy operations for which there are plenty of tutorials and help here on SO. SQLAlchemy can also take care of setting up the database schema for you, if that is needed.
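A minimal sketch of that wiring, assuming a recent zope.sqlalchemy (which exposes register(); older releases used ZopeTransactionExtension instead); the connection URLs and the model are hypothetical placeholders:

# A sketch: wire a SQLAlchemy session into the Zope transaction manager.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, scoped_session, sessionmaker
from zope.sqlalchemy import register

# One engine per RDBMS; SQLAlchemy dialects cover MySQL, PostgreSQL, MSSQL.
# For multiple databases, create additional engines and sessions, e.g.
# mysql_engine = create_engine('mysql://user:password@host/dbname').
pg_engine = create_engine('postgresql://user:password@host/dbname')

Session = scoped_session(sessionmaker(bind=pg_engine))
register(Session)  # join the session to the Zope transaction manager

Base = declarative_base()

class Client(Base):
    __tablename__ = 'client'
    id = Column(Integer, primary_key=True)
    name = Column(String(100))

# The closest analogue to Django's syncdb: create any missing tables.
Base.metadata.create_all(pg_engine)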

How to separate read db server and write db server on django 0.97?

I am using Django version 0.97 with PostgreSQL 9.0. I have configured hot streaming replication with a master db server and a slave db server. My application has heavy bot-driven writes to the DB, while reads come only from the users. It would therefore be a big optimization to point the users' read access at the slave db and the bot's write access at the master db.
Unfortunately, multiple database support only arrived in Django 1.2, and upgrading my application would be a huge effort. I got some leads through the following link: http://www.eflorenzano.com/blog/post/easy-multi-database-support-django/ However, this also requires me to change every instance of db access in my application. Is there any simpler way to assign separate db servers for read access and write access by fiddling with the Django core db module?
Your best bet is to upgrade to 1.2 as it will be significantly less work than hacking together features that already exist. If you stick with 0.97 for much longer your life will only be more difficult down the road.
I'm guessing you may have some misconceptions on how using multiple DBs works in Django 1.2. You do not have to "change all instances of db access in [your] application" if you use the Database Routers feature of Django.
With a router, you can specify which database to use for reads and which for writes. All of your existing Django models should work and begin sending queries to the proper database. It's pretty simple to set up a router; just check the docs. All that is required is to create the router class, put it somewhere, and add a line to your settings, as sketched below.
It works really nicely and is not as much work as you may expect. You may have other issues with upgrading that you aren't telling us, but as far as models go you shouldn't have many problems.
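For illustration, a minimal sketch of such a router under Django 1.2+ conventions; the alias names 'default' (master) and 'replica' (slave) are assumptions matching entries you would define in settings.DATABASES:

# routers.py - a sketch; 'default' and 'replica' are assumed aliases
# defined in settings.DATABASES for the master and the hot-standby slave.
class ReplicaRouter(object):
    def db_for_read(self, model, **hints):
        return 'replica'   # user-facing reads hit the slave

    def db_for_write(self, model, **hints):
        return 'default'   # bot-driven writes hit the master

    def allow_relation(self, obj1, obj2, **hints):
        return True        # both aliases see the same replicated schema

# settings.py
# DATABASE_ROUTERS = ['myproject.routers.ReplicaRouter']

Existing ORM calls keep working unchanged; the router decides the connection per query.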
