I'm having issues making multiple AJAX POST calls to functions that access a database in my web app using SQLite and mod_wsgi. I have no issues making requests to one function, but as soon as I call a different function, I start getting "database is locked" errors. I've tried setting the variables as globals and just accessing them in the two functions, as well as opening and closing the database in each function, to no avail.
What's the proper way to interface with a database if you just have one application function in your code? Threads? Persistent connections?
I've used Django before, but wanted something bare-bones for this simple app running on my local machine.
The relevant section of code is:
import sqlite3

con = sqlite3.connect("/var/www/Knowledge/eurisko.sqlite")
con.row_factory = sqlite3.Row
cursor = con.cursor()
cursor.execute("update notes_content set c1content=?, c2timestamp=? "
               "where c0title=?", [content, timestamp, title])
con.commit()
cursor.close()
con.close()
The full file is here: http://pastebin.com/7yuiZFi2
I'm running Apache 2.2 on Ubuntu 10 with libapache2-mod-wsgi and Python 2.7.
See the warnings about concurrent access from multiple processes in the SQLite documentation:
http://www.sqlite.org/faq.html#q5
This information was provided on the mod_wsgi list, where the question was also asked, but I'm following up here.
This can be an issue because Apache/mod_wsgi supports both single-process and multi-process configurations, and the OP is likely using a multi-process configuration. Also see:
http://code.google.com/p/modwsgi/wiki/ProcessesAndThreading
for a description of the Apache/mod_wsgi process/threading model.
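One common mitigation, sketched below under those assumptions, is to open a short-lived connection per request and pass sqlite3's timeout argument, so that a writer in one Apache process waits for the lock held by another process instead of failing immediately. The function shape and path mirror the question's snippet; everything else is illustrative:

import sqlite3

DB_PATH = "/var/www/Knowledge/eurisko.sqlite"

def save_note(title, content, timestamp):
    # One connection per request; timeout makes SQLite retry for up to
    # 15 seconds while another process holds the write lock, instead of
    # raising "database is locked" right away.
    con = sqlite3.connect(DB_PATH, timeout=15)
    try:
        con.execute("update notes_content set c1content=?, c2timestamp=? "
                    "where c0title=?", [content, timestamp, title])
        con.commit()
    finally:
        con.close()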
Related
I am using an SQLite db that two applications use, one from C++ and another from Python. If a table is accessed by both applications simultaneously, a database error occurs. How can the same db be used by two applications? Thanks in advance.
I am trying to integrate alongside an existing application that uses ADS as its database.
When I connect my integration app using the code below, it connects fine until I try to run the original application at the same time. It seems to only allow one connection; my application seems to hold the connection and block all others. Yet I can have multiple instances of the original application running concurrently with no issue, which leads me to believe that it's the way in which I am trying to connect from my C# app. The error I'm getting when the original app is open and I then try to connect with my integration app is "The Advantage Data Dictionary cannot be opened. axServerConnect".
Error 7077: The Advantage Data Dictionary cannot be opened. axServerConnect
Does anyone have any suggestions? How can I create multiple connections at the same time?
Python code:
conn = adsdb.connect(DataSource=str(dbpath[0]), ServerType='local',
                     UserID=config.ADS_USERNAME, password=config.ADS_PASS)
According to this page in the ADS documentation, you can use connection pooling by providing pooling=True in your client connection arguments.
I think using this approach, you will be able to open multiple connections at the same time.
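If the Python client accepted the documented flag, the call from the question might look like the sketch below; as the edit that follows notes, the adsdb module may not actually support it, in which case pooling would have to be enabled on the C# side instead:

conn = adsdb.connect(DataSource=str(dbpath[0]), ServerType='local',
                     UserID=config.ADS_USERNAME, password=config.ADS_PASS,
                     pooling=True)  # documented for ADS clients; possibly unsupported by adsdb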
Edit
After checking the adsdb Python script, I think it does not support connection pooling. You should, however, be able to set up connection pooling in your C# application.
I'm currently working on a Django application. I can't add an element to my database from the admin view. I fill in all the information, but when I click the save button the operation doesn't finish and I get a timeout. I use sqlite3 as the database.
My question is: does anyone know the origin of this problem? If not, how could I investigate it? When I worked with other languages (Java, C, etc.) and had a problem, I could use a debugger. What options do I have?
This problem can occur for the following reasons:
(Less probable) Your computation code is too slow: this is rare, because the timeout is set to about a minute and code usually doesn't take that long to execute.
Your app is waiting on some external resource that is not responding: for this you will have to check the Django logs for an external-resource error.
(Most probable) The database is taking too much time. This can occur because either:
The app can't connect to the database: check the database logs, or try to connect manually to the database through python manage.py dbshell.
A DB query is taking too long to execute: test this by checking the database logs for how much time a query takes, or connect manually via dbshell and run the same query there.
You can also use tools like django-profiler and the Django Debug Toolbar for debugging purposes, and the Python debugger (pdb) for native Python code; a sketch of the pdb approach follows.
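For example, here is a minimal way to pause execution inside a suspect Django view and step through the slow save; the view name and response are illustrative, not from the question:

import pdb

from django.http import HttpResponse

def save_note(request):  # hypothetical view name
    pdb.set_trace()  # execution pauses here; 'n' steps, 's' steps in, 'c' continues
    # ... the slow database write would happen below ...
    return HttpResponse("saved")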
I was just putting the finishing touches to a site built using web.py, MySQL and python-mysql (the MySQLdb module), feeling good about having protected against SQL injections and the like, when I leant on the refresh button, sending 50 or so simultaneous requests, and it crashed my server! I reproduced the error and found that I get the following two errors interchangeably; sometimes it's one and sometimes the other:
Error 1:
127.0.0.1:60712 - - [12/Sep/2013 09:54:34] "HTTP/1.1 GET /" - 500 Internal Server Error
Exception _mysql_exceptions.ProgrammingError: (2014, "Commands out of sync; you can't run this command now") in <bound method Cursor.__del__ of <MySQLdb.cursors.Cursor object at 0x10b287750>> ignored
Traceback (most recent call last):
Error 2:
python(74828,0x10b625000) malloc: *** error for object 0x7fd8991b6e00: pointer being freed was not allocated
*** set a breakpoint in malloc_error_break to debug
Abort trap: 6
Clearly the requests are straining MySQL and causing it to fall over, so my question is how do I protect against this happening?
My server is set up using Ubuntu 13.04, nginx, MySQL (which I connect to with the MySQLdb Python module), web.py and FastCGI.
When the web.py app starts up it connects to the database as so:
def connect():
    global con
    con = mdb.connect(host=HOST, user=USER, passwd=PASSWORD, db=DATABASE)
    if con is None:
        print 'error connecting to database'
and the con object is assigned to a global variable so various parts of the application can access it.
I access the database data like this:
def get_page(name):
    global con
    with con:
        cur = con.cursor()
        cur.execute("SELECT `COLUMN_NAME` FROM `INFORMATION_SCHEMA`.`COLUMNS` WHERE `TABLE_SCHEMA`='jt_website' AND `TABLE_NAME`='pages'")
        table_info = cur.fetchall()
One idea I had was to open and close the database connection before and after each request, but that seems like overkill to me. Does anybody have any opinions on this?
What sort of methods do people use to protect their database connections in python and other environments and what sort of best practices should I be following?
I don't use web.py, but the docs and tutorials show a different way to deal with the database.
They suggest using a global object (you create it in .connect), which will probably be a global proxy in the Flask style.
Try organizing your code as in this example (dead link; a sketch of that style follows) and see if the error happens again.
The error you reported seems to be a concurrency problem, which is normally handled automatically by the framework.
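Since the linked example is dead, here is a minimal sketch of that style for this app, using web.py's web.database object in place of a raw global MySQLdb connection; the credentials are placeholders, and the table name echoes the question:

import web

# One global database object; web.py manages connections and transactions
# per request internally, so handlers never share a raw connection.
db = web.database(dbn='mysql', host='localhost', user='USER',
                  pw='PASSWORD', db='jt_website')

def get_page(name):
    # Parameterized query via vars, as the web.py docs recommend.
    return db.select('pages', where='name = $name', vars={'name': name})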
About the latter question:
What sort of methods do people use to protect their database connections in python and other environments and what sort of best practices should I be following?
It differs depending on the web framework you use. Django, for example, hides everything and it just works.
Flask lets you choose what you want to do. You can use flask-sqlalchemy, which uses the very good SQLAlchemy ORM and manages the connection proxy for the web application.
I am facing a problem where data retrieved from MySQL using PySQLPool is returned as the DB was at the start of the process; INSERT or UPDATE queries made from Python or a MySQL client do not show up until I kill and re-run the Python process.
Would appreciate any help regarding this.
Ref: Why are some mysql connections selecting old data from the mysql database after a delete + insert?
MySQL's isolation level was causing this. Somehow only Python clients are affected, and I had never stumbled across this issue earlier. It is a valid problem and has a detailed solution. My question targeted Python and PySQLPool because it did not occur to me that MySQL could be the one causing this. Now my deployment procedure includes altering the global isolation level for MySQL to READ-COMMITTED.
SET GLOBAL tx_isolation='READ-COMMITTED';
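If changing the server-wide default is undesirable, a sketch of an alternative is to set the isolation level per session from the Python side; the connection parameters below are placeholders, and tx_isolation is the variable name used by MySQL 5.x:

import MySQLdb

conn = MySQLdb.connect(host='localhost', user='USER', passwd='PASSWORD',
                       db='mydb')
cur = conn.cursor()
# Applies only to this session, so other clients keep the server default.
cur.execute("SET SESSION tx_isolation = 'READ-COMMITTED'")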