In Python, can we set a table name as a variable using sqlite3?

For example:
import sqlite3

def ChangeTable(c, a):
    c.execute('''DELETE FROM MY_TABLE WHERE id = ?''', (a,))
This way I can change the value of a and process the database with a function in Python. But how can I do something similar with table names? Then I could use one function to handle different tables.

Passing in table names is intentionally unsupported, since dynamically using table names in SQL queries is generally a bad idea. If you find yourself wanting to work with table names dynamically, you should think about redesigning your database to make it relational. It's hard to give specific advice in your case, since we don't know how your database is arranged.
The other answer involves using string formatting in SQL queries. This is very bad coding practice, since it makes your code vulnerable to SQL injection; in fact, the sqlite3 documentation says never to do this. You don't want a Bobby Tables situation on your hands.

You can't use variables for table names; you have to perform string concatenation or substitution. As an example, using an f-string (Python >= 3.6):
def change_table(table_name, id):
    c.execute(f'DELETE FROM {table_name} WHERE id = ?', (id,))
with more meaningful variable names...
Triple quoting is not required here but useful for multiline statements.
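Because the table name is interpolated rather than bound, it's worth validating it first. A minimal sketch, assuming you know the full set of legitimate table names up front (the names below are hypothetical):

ALLOWED_TABLES = {'employees', 'departments'}  # hypothetical whitelist

def delete_from_table(c, table_name, id_value):
    # refuse anything outside the known-good set before interpolating
    if table_name not in ALLOWED_TABLES:
        raise ValueError(f'unknown table: {table_name!r}')
    c.execute(f'DELETE FROM {table_name} WHERE id = ?', (id_value,))

This keeps the convenience of one function per operation while closing the injection hole the previous answer warns about.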


What if I *really* need to escape quotes for an SQL script?

Please don't reach for the "duplicate" gun just yet.
I need to generate a series of SQL statements involving literal strings that contain the occasional single quote. Yeah, I know that parametrized queries are the way to go. The thing is, I'm not communicating with a database directly: I'm generating an SQL script that will be used to load data on another computer. So, I don't mind issuing parametrized queries to my local database (MySQL), but I'll need to output the complete SQL commands in text form. Can I do that in Python? The only suggestions I saw on SO are hacks like using repr() or json.dumps(), or are specific to psycopg. Surely that can't be all there is?
This application will never deal with untrusted data, so although I'd like the most robust solution possible, I'm not too worried about malformed unicode attacks and the like. What's the best way to do this?
You can subclass the psycopg2 cursor class to use mogrify to write the queries to a file instead of executing them against a database. You could probably also use mogrify directly (and save yourself setting up a database, etc.).
You could also use the query attribute for "playback" of a session.
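As a rough sketch of the direct approach (not the subclass), assuming a reachable PostgreSQL instance; the connection string, table, and file name are placeholders:

import psycopg2

conn = psycopg2.connect('dbname=test')  # placeholder DSN
cur = conn.cursor()

rows = [(1, 'Hello World'), (2, "That's ok")]
with open('load_data.sql', 'w') as f:
    for row in rows:
        # mogrify renders the parametrized query to the exact bytes
        # psycopg2 would send, with values safely quoted
        stmt = cur.mogrify('INSERT INTO mytab (id, f1) VALUES (%s, %s);', row)
        f.write(stmt.decode() + '\n')

Note that mogrify still needs a live connection, since quoting depends on the connection's encoding.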
While generating complex SQL may have its difficulties, I love Python for straight INSERTs. No tools, no libraries; plain Python solves this out of the box:
# syntactic sugar
def sq(s):
    return s.replace("'", "''")

# a third kind of string delimiter :3
sql_template = """INSERT INTO mytab ("the key", f1, f2)
VALUES (%d, '%s', '%s');"""

the_key = 7
f1 = "Hello World"
f2 = "That's ok"

# escape only string columns
sql = sql_template % (the_key, sq(f1), sq(f2))
print(sql)
The process for producing SQL is the same for any language. You need to first understand the entirety of the language (or data format; same difference here) that you want to produce. Failure to do this is the primary failing of most attempts: since they don't understand the language, they don't handle all the input that can cause a problem. When you know the entirety of the language, including all corner cases and rarely used constructs, then you can write code that understands all possible inputs and correctly represents them in that language.
The easiest solution is to let someone else who has already dealt with it do the hard part, or to sidestep the entire mess by using a different representation. This is why everyone recommends parameterized queries: it pushes the responsibility onto someone who has already solved the problem, and usually they solve it by using a different representation in the protocol than the SQL language itself.

How to store an SQL database in a Python object, and perform queries on the object?

I have a big PostgreSQL database. I would like to somehow store the full database in a Python object whose form/structure reflects that of the database. Namely, I imagine something like
* A database object with an attribute .tables, which is a kind of list of "table" objects; a table object has an attribute "list_of_keys" (a list of the column names) and an attribute "rows", which reflects all the rows of the corresponding table in the database.
Now, the main point I need is: I want to be able to perform a search in the database object with exactly the same SQL syntax that I would use on the corresponding SQL database. Thus something like
database.execute("SELECT * FROM .....")
where, I repeat, "database" is a purely Python object (which was filled with data coming from an SQL database, but which is now independent of it).
My aim is: I want to be able to apply the same algorithm either to an SQL database or to a Python object as described above, without changing my code. So, I imagine, let "database" be either a usual database connector/cursor (as with psycopg, for example) or a Python object as I described, and the same piece of code
database.execute("SELECT BLABLABLA")
would work in both cases.
Is there any known module which allows that?
Thanks.
It might get a bit complicated, but take a look at SQLite's in-memory storage:
import sqlite3
cnx = sqlite3.connect(':memory:')
cnx.execute('CREATE TABLE ...')
There are some differences in the SQL syntax, but the basic stuff works fine. This might also take a good amount of RAM, depending on your data.
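As a rough sketch of filling the in-memory copy from PostgreSQL (the connection details, table, and columns here are hypothetical):

import sqlite3
import psycopg2

pg = psycopg2.connect('dbname=mydb')  # placeholder DSN
pg_cur = pg.cursor()
pg_cur.execute('SELECT id, name FROM students')

mem = sqlite3.connect(':memory:')
mem.execute('CREATE TABLE students (id INTEGER, name TEXT)')
mem.executemany('INSERT INTO students VALUES (?, ?)', pg_cur.fetchall())
mem.commit()

# the in-memory copy now answers plain SQL, independently of PostgreSQL
for row in mem.execute('SELECT * FROM students WHERE name = ?', ('Alice',)):
    print(row)

Note that sqlite3 uses ? placeholders where psycopg uses %s, which is one of the syntax differences to watch for.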

pymssql and placeholders

What placeholders can I use with pymssql? I'm getting my values from the HTML query string, so they are all strings. Is this safe with regard to SQL injection?
query = dictify_querystring(Response.QueryString)
employeedata = conn.execute_row("SELECT * FROM employees WHERE company_id=%s and name = %s", (query["id"], query["name"]))
What mechanism is being used in this case to avoid injections?
There isn't much in the way of documentation for pymssql...
Maybe there is a better Python module I could use to interface with SQL Server 2005.
Thanks,
Barry
Regarding SQL injection, and not knowing exactly how that implementation works, I would say that's not safe.
Some simple steps to make it so:
1. Change that query into a prepared statement (or make sure the implementation internally does so, though it doesn't seem like it).
2. Make sure you use ' around your query arguments.
3. Validate the expected type of your arguments (that request parameters which should be numeric are indeed numeric, etc.).
Mostly... number one is the key. Using prepared statements is the most important and probably easiest line of defense against SQL injection.
Some ORMs take care of some of these issues for you (notice the ample use of the word some), but I would advise making sure you know these problems and how to work around them before using an abstraction like an ORM.
Sooner or later, you need to know what's going on under those wonderful layers of time-saving.
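As a rough sketch of both points with pymssql (the connection parameters are placeholders; pymssql substitutes %s placeholders with escaped values):

import pymssql

query = {'id': '42', 'name': 'Barry'}  # e.g. parsed from the query string

# validate expected types up front
try:
    company_id = int(query['id'])
except ValueError:
    raise ValueError('company_id must be numeric')

conn = pymssql.connect(server='localhost', user='sa',
                       password='secret', database='hr')  # placeholders
cur = conn.cursor()
cur.execute('SELECT * FROM employees WHERE company_id = %s AND name = %s',
            (company_id, query['name']))
row = cur.fetchone()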
Maybe there is a better Python module I could use to interface with SQL Server 2005.
Well, my advice is to use an ORM like SQLAlchemy to handle this.
>>> from sqlalchemy.ext.sqlsoup import SqlSoup
>>> db = SqlSoup('mssql:///DATABASE?PWD=yourpassword&UID=some_user&dsn=your_dsn')
>>> employeedata = db.employees.filter(db.employees.company_id==query["id"])\
...     .filter(db.employees.name==query["name"]).one()
You can use .one() if you want to raise an exception when there is more than one record, .first() if you want just the first record, or .all() if you want all records.
As a side benefit, if you later change to another DBMS, the code will remain the same except for the connection URL.

Python/SQLite: How do I escape commands/strings?

I would like to construct an sqlite3 database query in Python, then write it to a file.
I am a huge fan of Python's interfaces for SQL databases, which AFAICT wrap all the calls you could mess up with nice little '?' parameters that sanitize/escape your strings for you, but that's not what I want. I actually just want to prepare and escape an SQL statement; to do this, I need to escape/quote arbitrary strings.
For example:
query = "INSERT INTO example_table VALUES ('%s')",sqlite_escape_string("'")
And so query should contain:
"INSERT INTO example_table VALUES ('''')"
Note that it inserted an additional ' character.
PHP's equivalent is sqlite_escape_string().
Perl's equivalent is DBI's quote().
I feel Python has a better overall interface, but I happen to need the query, pre-exec.
When you use SQLite, it doesn't turn parameterized queries back into text. It has an API ("bindings") and stores the values separately; queries can be reused with different values just by changing the bindings. This is what underlies the statement cache. Consequently you'll get no help from Python/SQLite in doing what you describe.
What you didn't describe is why you want to do this. The usual reason is some form of tracing. My alternative Python/SQLite interface (APSW) provides easy tracing, and you don't even have to touch your code to use it:
https://rogerbinns.github.io/apsw/execution.html#apsw-trace
SQLite also has an authorizer API which lets you veto operations performed by a statement. This also has a side effect of telling you what operations a statement would end up performing.
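If you do need to render the literal yourself, the rule for SQLite (and standard SQL) string literals is simply to double any embedded single quote; a minimal sketch of the helper the question asks for:

def sqlite_quote(value):
    # render a Python string as a SQL string literal,
    # doubling embedded single quotes per SQLite's rules
    return "'" + value.replace("'", "''") + "'"

query = "INSERT INTO example_table VALUES (%s)" % sqlite_quote("'")
print(query)  # INSERT INTO example_table VALUES ('''')

This handles quoting only; the earlier warnings about hand-built SQL still apply.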

Python and PostgreSQL

If you wanted to manipulate the data in a table in a PostgreSQL database using some Python (maybe running a little analysis on the result set using SciPy) and then wanted to export that data back into another table in the same database, how would you go about the implementation?
Is the only/best way to do this simply to run the query, have Python store it in an array, manipulate the array in Python, and then run another SQL statement to output to the database?
I'm really just asking, is there a more efficient way to deal with the data?
Thanks,
Ian
You could use PL/Python to write a PostgreSQL function to manipulate the data.
http://www.postgresql.org/docs/current/static/plpython.html
Although tbh I think it's much of a muchness compared to processing it in an external client for most cases.
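As a rough sketch, assuming the plpython3u extension is installed in your database; the table and column names are illustrative, and the real details are at the link above:

-- run once: CREATE EXTENSION plpython3u;
CREATE FUNCTION scale_measurements() RETURNS integer AS $$
    rows = plpy.execute("SELECT id, value FROM measurements")
    plan = plpy.prepare(
        "INSERT INTO results (id, scaled) VALUES ($1, $2)",
        ["integer", "double precision"])
    for r in rows:
        plpy.execute(plan, [r["id"], r["value"] * 2.0])
    return len(rows)
$$ LANGUAGE plpython3u;

The function body is ordinary Python; plpy.execute and plpy.prepare are the built-in database access calls.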
I'm not sure I understand what you mean, but I'd say it sounds very much like
INSERT INTO anothertable SELECT stuff FROM the_table RETURNING *
and then work on the returned rows. That is, of course, if you don't want to modify the data when you manipulate it.
Is the only/best way to do this simply to run the query, have Python store it in an array, manipulate the array in Python, and then run another SQL statement to output to the database?
Not the only way (see the other answers), but IMHO the best and certainly the simplest. It just requires a PostgreSQL library (I use psycopg). The standard interface is documented in PEP 249.
An example of a SELECT with psycopg:
cursor.execute("SELECT * FROM students WHERE name=%(name)s;",
globals())
and an INSERT:
cursor.execute("INSERT INTO Foobar (t, i) VALUES (%s, %s)",
["I like Python", 42])
pgnumpy seems to be what you're looking for.
I'd think about using http://www.sqlalchemy.org/.
SQLAlchemy is the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL.
It supports postgres, too - http://www.sqlalchemy.org/docs/05/reference/dialects/postgres.html
You could use an ORM such as SQLAlchemy to retrieve the data into an "object", and manipulate that object. Such an implementation would probably be more elegant than just using an array.
I agree with the SQLAlchemy suggestions or using Django's ORM. Your needs seem too simple for PL/Python to be used.
