Creating Database for iOS - Python

I'm creating an app in which the database will be updated via a server, or replaced by a new one. Can you give me some tips: should I use Core Data or SQLite? And how should it be updated? I'm using Python for parsing the data, and then I want to create the DB.
Thank you!

This question is not very specific, but if you are not sure what you really need, stay away from Core Data. Start with SQLite and maybe use a light wrapper on top of it. I personally use FMDB.
There are quite a few examples in FMDB on how to select, insert, update, etc. A quick select example from the Executing Queries section of the README:
FMDatabase *db = [FMDatabase databaseWithPath:@"/tmp/tmp.db"];
[db open];
FMResultSet *s = [db executeQuery:@"SELECT * FROM myTable"];
while ([s next]) {
    // retrieve values for each record, e.g.:
    // NSString *name = [s stringForColumn:@"name"];
}
[db close];
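Since the database itself will be built in Python, here is a minimal sketch of that side using the standard-library sqlite3 module (the table name and columns are made up for illustration); the resulting file can then be hosted on your server and downloaded by the app to replace the old one:

import sqlite3

# Build the SQLite file the iOS app will download.
conn = sqlite3.connect("myapp.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS myTable (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        price REAL
    )
""")

# Rows produced by your parsing step (placeholders here).
parsed_rows = [(1, "widget", 9.99), (2, "gadget", 14.50)]
conn.executemany("INSERT INTO myTable VALUES (?, ?, ?)", parsed_rows)

conn.commit()
conn.close()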

Related

Python, SQLAlchemy, MySQL: Insert data between existing records

Unfortunately I couldn't find any useful information on this topic. I have an existing database with tables and data already in it. Now I have to add new data in between the existing data. My code would look something like this, but it doesn't work:
INSERT INTO table_name(data) VALUES('xyz')
WHERE DATETIME(datetime) > DATETIME('2017-01-01 02:00:00');
I have created an image for a better understanding of my question.
Please note that the primary key needs to adapt to the changes, as you can see in the picture. My tools are Python, SQLAlchemy and MySQL. I look forward to any help.
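For reference: SQL has no INSERT ... WHERE form, which is why the statement above fails. One common workaround is to first shift the primary keys of the later rows to make a gap, then insert into it. A minimal sketch with SQLAlchemy, assuming an integer id primary key and the PyMySQL driver (both assumptions, not from the question):

from sqlalchemy import create_engine, text

# Placeholder connection URL.
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

with engine.begin() as conn:
    # Make room: shift the keys of all later rows up by one.
    # Descending order avoids transient duplicate-key errors in MySQL.
    conn.execute(text(
        "UPDATE table_name SET id = id + 1 "
        "WHERE `datetime` > '2017-01-01 02:00:00' "
        "ORDER BY id DESC"
    ))
    # Insert the new row into the freed key (42 is a placeholder;
    # in practice you would compute the boundary id first).
    conn.execute(text(
        "INSERT INTO table_name (id, data) VALUES (:id, :data)"),
        {"id": 42, "data": "xyz"})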

Python or SQL: Populating an excel form (multiple times and saving outputs) from another table

Problem: A customer has requested that we fill out a form (Excel) for each item we provide them. Since we provide them a large number of parts, I would like to figure out a way to automate it as much as possible.
Idea: Create a table ('Data') with each part number and relevant information in the columns. Use Python to read the 'Data' table, open the blank customer form, populate it, and then save a copy of the form for each part.
Questions:
Can SQL accomplish this task as well? In relation to this task, I've only really created flat table outputs with SQL. Not really sure how this would work.
Recommended Python packages / documentation?
Similar example with code available? Just helps me learn being able to walk through something.
Any other ideas? Maybe I am tackling this issue the wrong way.
I am just unsure of my best path of action.
You could create a simple table on your SQL system (PostgreSQL, MySQL), so you can easily add and modify your items.
Then you can export your table as a CSV file (which Excel opens) using PostgreSQL's COPY:
COPY (SELECT * FROM foo) TO '/tmp/test.csv' WITH CSV DELIMITER ',';
You can also do it with Python, but I think it's more complicated to update items with Python; with a SQL system you could create an HTML/PHP front-end page, making it more customizable.
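If you do go the Python route, a minimal sketch of the read-table-and-fill-form idea using openpyxl (the file names, sheet layout, and cell positions here are all assumptions for illustration):

from openpyxl import load_workbook

# Assumed layout: data.xlsx holds the 'Data' table (one part per row,
# header in row 1); form_template.xlsx is the blank customer form.
data_wb = load_workbook("data.xlsx")
data_ws = data_wb.active

for row in data_ws.iter_rows(min_row=2, values_only=True):  # skip header
    part_number, description, quantity = row

    # Open a fresh copy of the blank form for each part.
    form_wb = load_workbook("form_template.xlsx")
    form_ws = form_wb.active

    # Hypothetical cell positions on the customer form.
    form_ws["B2"] = part_number
    form_ws["B3"] = description
    form_ws["B4"] = quantity

    form_wb.save(f"form_{part_number}.xlsx")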

Insert statement created by Django ORM at bulk_create

I am kind of new to Python and Django.
I am using bulk_create to insert a lot of rows, and as a former DBA I would very much like to see what insert statements are being executed. I know that for querysets you can use .query, but for insert statements I can't find a command.
Is there something I'm missing, or is there no easy way to see it? (A regular print is fine by me.)
The easiest way is to set DEBUG = True and check connection.queries after executing the query. This stores the raw queries and the time each query takes.
from django.db import connection

MyModel.objects.bulk_create(...)
# connection.queries is only populated when DEBUG = True
print(connection.queries[-1]['sql'])
There's more information in the docs.
A great tool to make this information easily accessible is the django-debug-toolbar.

Using Python to Query multiple SQL databases on different servers

I have been doing a fair amount of manual data analysis, reporting and dashboarding recently via SQL and wonder if Python could automate a lot of this. I am not familiar with Python at all, so I hope my question makes sense. For security/performance reasons, we store databases on a number of servers (more than 5) which contain data that would be pertinent to a query. Unfortunately, these servers are set up so they cannot talk to each other, so I can't pull data from two servers in the same query. I believe this is a limitation due to using Windows credentials/security.
For my data analysis and reporting needs, I need to be able to grab pertinent data from two or more of these, so the way I currently do this is by running a query, grabbing the results, running another query with those results, doing some formula work in Excel, and so on until I get what I need.
Unfortunately this is both time-consuming and forces me to pull massive datasets (multiple millions of rows), which I then have to continually narrow down based on criteria that live in said databases.
I know Python has the ability to query SQL Server, however I figured I would ask the experts:
Can I manipulate the data in the background with Python similar to how I can in Excel (lookups, statistical functions, etc.), perhaps even XML/web APIs?
Can Python handle connections to multiple different database servers at the same time?
Does Python handle Windows credentials well?
If Python is not the tool for this, can you name one that would work better?
Please let me know if I can provide additional pertinent details.
Ideally, I would like to end up creating our own separate database and creating automated processes to pull everything from other databases but currently that is not possible due to project constraints.
Thanks!
I haven't used Windows credentials, but I have used Python to work with multiple MS SQL databases at the same time, and it worked very well. You can use the pymssql library, or better, SQLAlchemy.
That said, I think you should start with a basic Python tutorial first. Since you want to work with millions of rows, it's very important to understand list, set, tuple and dict in Python; for good performance, you should use the right type.
A basic example with pymssql (note that SQL Server uses TOP rather than LIMIT):
import pymssql

conn1 = pymssql.connect("Host1", "user1", "password1", "db1")
conn2 = pymssql.connect("Host2", "user2", "password2", "db2")

cursor1 = conn1.cursor()
cursor2 = conn2.cursor()

# SQL Server syntax: TOP instead of LIMIT
cursor1.execute('SELECT TOP 10 * FROM TABLE1')
cursor2.execute('SELECT TOP 10 * FROM TABLE2')

result1 = cursor1.fetchall()
result2 = cursor2.fetchall()

# print each row from the first server
for row in result1:
    print(row)

# print each row from the second server
for row in result2:
    print(row)

conn1.close()
conn2.close()
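Since SQLAlchemy was mentioned, here is the same two-server setup sketched with it (the connection URLs are placeholders; this uses the modern text() style):

from sqlalchemy import create_engine, text

# Placeholder connection URLs, using the pymssql driver.
engine1 = create_engine("mssql+pymssql://user1:password1@Host1/db1")
engine2 = create_engine("mssql+pymssql://user2:password2@Host2/db2")

with engine1.connect() as conn1, engine2.connect() as conn2:
    rows1 = conn1.execute(text("SELECT TOP 10 * FROM TABLE1")).fetchall()
    rows2 = conn2.execute(text("SELECT TOP 10 * FROM TABLE2")).fetchall()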
You can do all of what you asked. Python lets you create multiple connection objects via a library, so, for example, if you use a MySQL connector for Python you would create two different objects like this:
# not actual code, just an example -- with mysql-connector-python it would be:
import mysql.connector

conn1 = mysql.connector.connect(host="server1", user="user", password="password")
conn2 = mysql.connector.connect(host="server2", user="user", password="password")
Like this, conn1 connects to one database and conn2 connects to a different one; queries are executed through a cursor on each connection:
cursor1 = conn1.cursor()
cursor2 = conn2.cursor()
cursor1.execute(query_to_server_1)
cursor2.execute(query_to_server_2)
This lets you maintain two different connections in the same script. If you are looking for multithreading, Python's standard threading module can help you execute multiple tasks from one master script.
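On the question's "manipulate the data like Excel" point: a sketch of pulling result sets from two servers into pandas and joining them in memory (the connection URLs, table and column names are invented for illustration):

import pandas as pd
from sqlalchemy import create_engine

engine1 = create_engine("mssql+pymssql://user1:password1@Host1/db1")
engine2 = create_engine("mssql+pymssql://user2:password2@Host2/db2")

# Pull a result set from each server into a DataFrame.
orders = pd.read_sql("SELECT order_id, customer_id, total FROM orders", engine1)
customers = pd.read_sql("SELECT customer_id, region FROM customers", engine2)

# Excel-style "lookup": join the two result sets on customer_id,
# then aggregate like a pivot table.
merged = orders.merge(customers, on="customer_id", how="left")
print(merged.groupby("region")["total"].sum())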

Python: RE vs. Query

I am building a website using Django, and this website uses blocks which are enabled for a certain page.
Right now I use a TextField containing the paths where a block is enabled. When a page is requested, Django retrieves all blocks from the database and does re.search on the TextField.
However, I was wondering whether, in terms of overhead, it would be better to use a separate DB table for block/paths, where each row contains a single path and a reference to a block.
A separate DB table is definitely the "right" way to do it, because MySQL has to send all the data from your TEXT fields every time you query. As you add more rows and the TEXT fields get bigger, you'll start to notice performance issues and eventually crash the server. Also, you'll be able to use VARCHAR and add a unique index to the paths, making lookups lightning fast.
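A minimal sketch of that separate-table approach as Django models (the model and field names are hypothetical):

from django.db import models

class Block(models.Model):
    name = models.CharField(max_length=100)

class BlockPath(models.Model):
    block = models.ForeignKey(Block, on_delete=models.CASCADE, related_name="paths")
    path = models.CharField(max_length=255, db_index=True)

    class Meta:
        unique_together = ("block", "path")

# The per-request lookup then becomes a single indexed query:
# Block.objects.filter(paths__path=request.path)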
I am not exactly familiar with Django, but if I am understanding the situation correctly, you should use a table.
In fact this is exactly the kind of use that DB software is designed and optimized for.
No worries. It will actually be faster.
By doing the search yourself, you are trying to implement part of the DB logic on your own. Fun, certainly, but not so fast. :)
Here are some nice links on designing a database:
http://dev.mysql.com/tech-resources/articles/intro-to-normalization.html
http://en.wikipedia.org/wiki/Third_normal_form
Hope this helps. Good luck. :-)
