localhost django dev server vs. postgres slow on mac os? - python

Anyone notice slowness from a django dev server running on Mac OS and connecting to a remote (postgres) db? It doesn't seem to be the DNS problem referenced elsewhere. We've got a staging instance running the exact same code on the same remote staging box that's hosting the db, and the performance on that instance is very crisp.
Here's the output of the performance middleware running locally:
Total: 19.58 Python: 6.39 DB: 13.19 Queries: 17
And on the staging server:
Total: 0.07 Python: 0.05 DB: 0.02 Queries: 16
Maybe it's postgres client network overhead from connecting to the remote db, or something? I don't mind doing the development on the staging server, but it's nice being able to run things locally too.

Two things:
The Django dev server is painfully slow, and any external connections it makes are bottlenecked further by it.
The connection to an external database is limited by your machine's upstream and downstream bandwidth (the bottleneck usually being your internet connection).
Any time you're developing locally and connecting to an external database server, it will be slow. For concurrent Drupal development at work, we source control our sites folder and use the same database that, though external, never leaves our local network. It's still like molasses in Alaska in January.
I strongly recommend setting up PostgreSQL locally and copying your external database to a local one. It isn't a very time-intensive process and will save you headaches and keep you much more productive.
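The answer doesn't spell out the copy step, but here is a minimal sketch of one way to do it, assuming the remote staging database is configured in settings.py under a hypothetical 'staging' alias alongside the local 'default' one, and run from manage.py shell so Django's settings are loaded:

from django.core.management import call_command

# Dump everything from the remote staging database to a fixture file
# (on some projects you may need to exclude contenttypes/permissions)...
with open('staging_dump.json', 'w') as out:
    call_command('dumpdata', database='staging', stdout=out)

# ...then load it into the freshly created local database.
call_command('loaddata', 'staging_dump.json', database='default')

pg_dump/pg_restore is the more usual route for a full copy; the above just keeps everything inside Django.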

I faced the same problem when I was using a replica of my production database in my development environment. The problem turned out to be the django_session table, which had grown to gigabytes in size. The simplest solution was to clear that table, since I did not need users' session data in development.
I used a simple command:
TRUNCATE TABLE django_session;
Additional info on the issue can be found here:
https://dba.stackexchange.com/questions/52733/why-django-session-table-grows-on-postgresql
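If you only need to drop expired sessions rather than everything, recent Django versions also ship a clearsessions management command that does this without raw SQL; a minimal sketch, run from manage.py shell so settings are loaded:

from django.core.management import call_command

# Deletes only *expired* rows from django_session.
call_command('clearsessions')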

Related

Would an old version of MySQL (5.1) cause performance issues in Django (1.11)?

I'm rewriting a web application using Django 1.11. When I hooked it up to my test MySQL database (version 5.7), the performance was amazing: the page renders within 1 second. However, when I connect to the existing production MySQL (version 5.1), the page takes more than 10 seconds.
I installed the debug toolbar and found that the query time is actually not the issue. Most of the time is spent in the CPU.
I've eliminated the possibility of a MySQL performance problem by running the query directly against the database. It's not a network issue either, since my ping is fine.
I am wondering whether Django has issues with the older MySQL version when it receives the data and tries to map it to objects via the ORM.
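The question doesn't include profiling output, but one way to check whether the CPU time really goes into ORM row-to-object mapping is to profile a representative queryset outside the request cycle. A sketch using the standard library profiler, run from manage.py shell; Order is a hypothetical model name:

import cProfile
import pstats

from myapp.models import Order   # placeholder app/model

def exercise_orm():
    # list() forces full evaluation, so row-to-object conversion is included.
    list(Order.objects.all()[:1000])

profiler = cProfile.Profile()
profiler.runcall(exercise_orm)
profiler.dump_stats('orm_profile')
pstats.Stats('orm_profile').sort_stats('cumulative').print_stats(20)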

Is it possible to deploy Django with Sqlite?

I've built a Django app that uses sqlite (the default database), but I can't find anywhere that allows deployment with sqlite. Heroku only works with postgresql, and I've spent two days trying to switch databases and can't figure it out, so I want to just deploy with sqlite. (This is just a small application.)
A few questions:
Is there anywhere I can deploy with sqlite?
If so, where/how?
SQLite is a database on disk. It is very useful for development purposes, but services like Heroku expect your server-side code to be stateless, which effectively rules out databases such as SQLite. I guess you could make it work (provided you find a place on Heroku's disk to put your SQLite db), but you would constantly lose your database's contents every time you redeploy.
For Heroku specifically, I'll redirect you to this link which explains how to use Django with PostgreSQL on Heroku.
Don't use SQLite on Heroku. As stated in the docs, you will lose your data:
SQLite runs in memory, and backs up its data store in files on disk. While this strategy works well for development, Heroku's Cedar stack has an ephemeral filesystem. You can write to it, and you can read from it, but the contents will be cleared periodically. If you were to use SQLite on Heroku, you would lose your entire database at least once every 24 hours.
Even if Heroku's disks were persistent running SQLite would still not be a good fit. Since SQLite does not run as a service, each dyno would run a separate running copy. Each of these copies need their own disk backed store. This would mean that each dyno powering your app would have a different set of data since the disks are not synchronized.
Instead of using SQLite on Heroku you can configure your app to run on Postgres.
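A common way to wire that up (not part of the quoted docs) is to read Heroku's DATABASE_URL config var, for example with the third-party dj-database-url package; a minimal sketch for settings.py:

import dj_database_url

DATABASES = {
    # Falls back to a local SQLite file when DATABASE_URL is unset,
    # e.g. during local development.
    'default': dj_database_url.config(default='sqlite:///db.sqlite3'),
}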
Sure, you can deploy with SQLite ... it's not really recommended, but it should work OK if you have low network traffic.
You set your database engine to sqlite in settings.py ... just make sure you have write access to the path that you specify for your database.
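For reference, a minimal sketch of what that looks like in settings.py; the file path is just an example:

import os

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        # The server process needs write access to this file *and* its directory.
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}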

How to efficiently save data to a remote server database (python)

Currently in my code, there are a few client machines doing processing and one server machine with a database. After an individual client machine finishes processing some data, it saves the data to a .txt file and sftp's it over to the server.
The server has a job that just waits for these txt files and stores the data into a database.
I wanted to know of any other, more efficient approaches for this (I'm kind of a Python beginner). Is there a way to save data into the server's database remotely? How do I do so securely?
To be more specific, this project is a web app hosted in Django. I know how to use Django's standalone scripts to save data into a db; I just need to know how to do so remotely.
Thank you.
Django databases can be remote - there is no requirement that they be on the same host as the Django server. Just set an appropriate HOST and PORT. See: https://docs.djangoproject.com/en/dev/ref/databases/#id10
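A minimal sketch of such a configuration in settings.py; the hostname, credentials and database name are placeholders:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'appdb',
        'USER': 'appuser',
        'PASSWORD': 'secret',
        'HOST': 'db.example.com',   # the remote database server
        'PORT': '5432',             # PostgreSQL's default port
    }
}

The database server must also be configured to accept remote connections (for PostgreSQL that means listen_addresses in postgresql.conf and an entry in pg_hba.conf).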
Update: Based on your comment, I understand that you want to write python/django code that will run in the browser, and connect to a remote database. There is no practical way of doing this. Have the data sent back to your server, and forward it on from there.
Update 2: If you are able to distribute software outside of the browser, you could have a small django deployment on each client computer, which the user connects to through their browser, which could connect directly to the database. Obviously, security considerations apply.

Python memcached?

I've been working with PHP for a while now and have found the memcached library for PHP.
Now I have a search engine in Python, and I want to cache content from databases for it...
I am aware of the mod_python module for the Apache server. Can I use it somehow to cache stuff?
I mean, once my cache is warm, it should stay warm as long as the server is powered on.
(Of course, if the server is powered off, the memory is cleared, and hence memcached will be cleared too.)
My homework:
I've come across this page: python memcached, but I'm not sure if I can use it in conjunction with the Apache server...
Thanks to all who helped...
https://pypi.python.org/pypi/python-memcached/ :
This software is a 100% Python interface to the memcached memory cache daemon. It is the client side software which allows storing values in one or more, possibly remote, memcached servers.
That means that memcached runs as a separate daemon, independent of httpd, and your code can store data in that daemon much like in a database. In fact, memcached is essentially nothing more than a NoSQL database.
For a tutorial, see Good examples of python-memcache (memcached) being used in Python? .
Apache has nothing at all to do with this - your code makes its own connection to the daemon.
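A minimal sketch of the python-memcached client, assuming a memcached daemon is already running on localhost's default port 11211; the key and value are arbitrary examples:

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

mc.set('search:query:foo', {'hits': 42}, time=300)   # cache for 5 minutes
result = mc.get('search:query:foo')                  # None once expired or flushed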

(Python/Django): How do I keep my production db in sync (schema and data) with my dev PC's db?

I have a local Postgres database which will be filled with data (daily) on my local development machine. What is a good solution to transfer/sync/mirror this data to a production Postgres database?
For what it's worth I'm developing in Python using Django.
Thanks!
This seems like a strange workflow to me. Wouldn't it be much better to import the data into the production database and then just sync it with your development db from time to time?
IMO, the development machine shouldn't be included in the production data workflow.
That's the way I do it using fabric. I've written a simple function which copies part of the production db onto the local development machine.
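The answer doesn't include that function, but here is a sketch of what such a Fabric (1.x) task might look like; the host, database names, credentials and paths are all placeholders, and the local appdb_dev database is assumed to already exist:

from fabric.api import env, get, local, run

env.hosts = ['prod.example.com']   # hypothetical production host

def pull_db():
    """Dump the production Postgres db and restore it into the local dev db."""
    run('pg_dump -U appuser -Fc appdb > /tmp/appdb.dump')
    get('/tmp/appdb.dump', '/tmp/appdb.dump')
    local('pg_restore --clean --no-owner -d appdb_dev /tmp/appdb.dump')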
South is a great tool for dealing with database migrations in Django projects. The latest release supports both schema and data migrations:
http://south.aeracode.org/docs/tutorial/part3.html#data-migrations
The app provides a number of management commands which let you generate migration files that, when run, can alter the database schema or insert records. It's great for automating changes to a production environment or when working on a team. You could then use something like Fabric (or do it manually if you must) to push up the migration files and run the migrate command to populate your database.
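For illustration, a sketch of what a South data migration looks like (created with ./manage.py datamigration myapp populate_defaults); the app, model and field names are placeholders:

from south.v2 import DataMigration

class Migration(DataMigration):

    def forwards(self, orm):
        # Use orm['app.Model'] rather than importing models directly, so the
        # migration keeps working as the schema evolves.
        orm['myapp.Category'].objects.get_or_create(name='Uncategorized')

    def backwards(self, orm):
        orm['myapp.Category'].objects.filter(name='Uncategorized').delete()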
