Capturing Sybase Showplan programmatically - python

I'm using SQLAlchemy in Python to execute my SQL queries, and I have prefixed my queries with set showplan on. However, I am not getting the plan back with my results. Does anyone know whether the plan output is stored in some system table, or whether there is a flag that needs to be enabled for the SQLAlchemy DB API to capture the plan?
Just to reiterate, I'm running against a Sybase database.

It might be worth trying
set noexec on
... as well to see if a query plan comes back.
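For what it's worth, Sybase sends showplan output back as informational server messages (like print output), not as a result set, which is why it never appears in the rows you fetch. Below is a hedged sketch of how you might capture it from Python; the connection URL and table name are placeholders, and whether the plan text is reachable at all depends on your driver implementing the optional DB-API "messages" extension.

```python
def with_showplan(sql, noexec=False):
    """Prefix a Sybase query with showplan (and optionally noexec) settings."""
    prefix = "set showplan on\n"
    if noexec:
        # noexec asks the server for the plan without actually running the query
        prefix += "set noexec on\n"
    return prefix + sql

# Hypothetical usage against a Sybase engine (untested sketch):
# engine = create_engine("sybase+pyodbc://user:pass@mydsn")
# raw = engine.raw_connection()
# cur = raw.cursor()
# cur.execute(with_showplan("select * from my_table", noexec=True))
# # If the driver implements the optional DB-API "messages" extension,
# # the plan text may show up there rather than in the result rows:
# for msg in getattr(cur, "messages", []):
#     print(msg)
```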


How to approach doing a one-time load of my data from Oracle to MariaDB?

I'm not sure how to go about doing a one-time load of the existing data I have in Oracle into MariaDB. I have DBeaver, which I am using to access both databases. I saw an option in DBeaver to migrate the data from the source (Oracle) to the target (MariaDB) in a few clicks, but I'm not sure if that's the best approach.
Is writing a Python script a better way of doing it? Should I download another tool to do a one-time load? We are using CData Sync for the incremental loads; basically, it copies data from one database to another (Oracle to SQL Server, for example) and does incremental loads. I'm not sure if I can use it to do a full/one-time load of all the data in my Oracle database into MariaDB. I'm new to this; I've never loaded data before. The thing is, I have over 1100 tables, so I can't manually write out the schema and a CREATE TABLE statement for each of the 1100 tables...
Option 1 DBeaver
If DBeaver is willing to attempt it in a few clicks, I'd try it on some small tables first and see what it gives.
Option 2 MariaDB connect
Alternatively, there is the MariaDB CONNECT engine, which can attach to the Oracle tables over ODBC or JDBC.
Note that you don't need to write the table structure for every table by hand, but you do need the list of tables so you can generate CREATE TABLE t1 ENGINE=CONNECT TABLE_TYPE=ODBC tabname='T1' CONNECTION='DSN=XE;.. for each table.
Then it would be:
create database mariadb_migration;
create table mariadb_migration.t1 like t1;
insert into mariadb_migration.t1 select * from t1;
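With 1100 tables, the per-table statements above are best generated by a short script rather than written by hand. A hedged sketch follows; the table names, target database, and DSN are placeholders, and the CONNECTION string will need to match your actual ODBC setup.

```python
def connect_table_ddl(table, dsn="XE"):
    """Build the CREATE TABLE statement for one MariaDB CONNECT passthrough table.

    The DSN is a placeholder; adjust CONNECTION for your ODBC configuration.
    """
    return (
        f"CREATE TABLE {table.lower()} ENGINE=CONNECT TABLE_TYPE=ODBC "
        f"tabname='{table.upper()}' CONNECTION='DSN={dsn}';"
    )

def migration_statements(table, target_db="mariadb_migration"):
    """The copy steps from the answer above, generated for one table."""
    return [
        f"CREATE TABLE {target_db}.{table} LIKE {table};",
        f"INSERT INTO {target_db}.{table} SELECT * FROM {table};",
    ]

# Feed this a table list pulled from Oracle's data dictionary, e.g.:
# for t in table_names:
#     print(connect_table_ddl(t))
#     print("\n".join(migration_statements(t)))
```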
Option 3 MariaDB Oracle Mode
This uses the Oracle compatibility mode of MariaDB.
Take a SQL dump from Oracle.
Prepend SET SQL_MODE='ORACLE'; to start of the dump.
Import this to MariaDB.
Option 4 SQLines
SQLines offers an Oracle to MariaDB migration tool.
Small disclaimer: I've not done any of these personally; I just know these options exist.

Autocomplete SQL query in db.execute('') on VScode

When writing an SQL query inside Python's execute() (Flask, if that's relevant), is there a setting or extension that would recognize SQL keywords like SELECT and UPDATE and suggest them with IntelliSense or the like?
Right now the query is recognized as in the picture and keywords are not being suggested.
SQL query keywords in VScode are not recognized (the whole query is green)
No, because you're just passing a string to execute() that is later read by SQLAlchemy (which I assume you're using). These aren't actually Python keywords that IntelliSense can predict. However, you can use the SQLAlchemy ORM, or the higher-level query object, which manipulates your database through Python methods rather than SQL keywords. With that, you may find that IntelliSense can locate the definition/declaration of each SQLAlchemy method and offer its usual pointers and helpers.
There are other advantages to using the higher-level query class of SQLAlchemy, a significant one being that you are less likely to be subject to SQL injection attacks. Because you are executing raw SQL with the execute() command and simply interpolating an id from the session into it, an attacker could alter the session value and inject harmful SQL into your application.
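To make that injection point concrete: even if you stay with raw SQL, passing the session value as a bound parameter keeps it from ever being parsed as SQL. A minimal sketch using Python's built-in sqlite3 as a stand-in for the asker's setup (SQLAlchemy's text() construct accepts bound parameters the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

user_id = "1; DROP TABLE users"  # a tampered session value

# Vulnerable pattern: f-string interpolation would let the value become SQL.
# Safe pattern: the driver binds the value as data, never as SQL.
rows = conn.execute(
    "SELECT name FROM users WHERE id = ?", (user_id,)
).fetchall()
# The tampered value matches no row, and the table survives intact.
```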
Anyway, that's beside the point but I thought it was worth letting you know.

Preventing SQL Injection for online SQL querying

I have a small Python project site where a visitor can practice writing SQL code. This code actually runs and returns values. I know that I need to prevent SQL injection, but I'm not sure of the best approach, since the purpose of the site is that users should be able to write and execute arbitrary SQL code against a real database.
What should I look to do to prevent malicious behavior? I want to prevent statements such as DROP xyz;, but users should still be able to execute code. I think the ideal solution may be that users can only "read" from the database, i.e., they can only run SELECT statements (or variations). But I'm not sure "read only" captures all edge cases of malicious behavior.
Need to prevent malicious SQL querying, but also need to allow users to execute code
Using SQLite now but will probably move to postgres
I'm strictly using SQL at this point but may want to add Python and other languages in the future
The site is built with Python (Flask)
Any ideas or suggestions would be helpful
There is no way to prevent SQL injection for a site that takes SQL statements as user input and runs them verbatim. The purpose of the site is SQL injection. The only way you can prevent SQL injection is to not develop this site.
If you do develop the site, how can you prevent malicious SQL? Answer: don't let malicious users have access to this site. Only allow trusted users to use it.
Okay, I assume you do want to develop the site and you do want to allow all users, without doing a lot of work to screen them.
Then it becomes a task of limiting the potential damage they can do. Restrict their privileges carefully, so they only have access to create objects and run queries in a specific schema.
Or better yet, launch a Docker container for each individual to have their own private database instance, and restrict the CPU and memory the container can use.

Python SQL connections

How to dump/load data from python test response to a database table(SQL)?
Assuming I know nothing, can you guide me through, or list, all the possible ways to dump/load/store data from a pytest response into a SQL table?
Below are the high level steps you should take to load data into a SQL database. The lack of context makes it impractical to go into further detail.
Set up a database (choose one, install it, configure it).
(Usually) change the database schema to suit your needs. (Could also happen after #3.)
Connect to the database from wherever you have the data.
Insert the data into the database.
Maybe this example will help.
I don't know what you mean by "responses".
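The high-level steps above might look like this in code, assuming the "response" is JSON-like data; the shape of response_data and the table schema are made up for illustration, and sqlite3 stands in for whatever SQL database is actually in use:

```python
import sqlite3

# Suppose the test "response" is JSON-like data, e.g. from response.json():
response_data = [
    {"id": 1, "status": "passed"},
    {"id": 2, "status": "failed"},
]

# Steps 1-2: set up a database and a schema that suits the data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, status TEXT)")

# Steps 3-4: connect from where the data lives (done above) and insert it.
conn.executemany(
    "INSERT INTO results (id, status) VALUES (:id, :status)", response_data
)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM results").fetchone()[0])  # 2
```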

Storing/Copying PostgreSQL Database to Another Server through SQLAlchemy

I know there are ways of copying data/tables from one server to another, such as the instructions provided here. However, since I use Python to scrape, create, and store data, I am wondering whether I could do this directly with SQLAlchemy. More precisely, after I store the scraped data in the database I created through SQLAlchemy on my own computer, can I simultaneously store/copy those databases/tables to another computer/server directly through SQLAlchemy? Can anyone help? Thanks so much.
