How to hide a SQL*Plus password in a Python script

I created a Python script which truncates an Oracle table. I use SQL*Plus, but the problem is that the password is currently stored in plain text. My variables look like this:
db_name = "DB_NAME"
db_user = "DB_USER"
db_password = "DB_PASS"
Then I run a command like:
sqlplus_delete_table = 'echo "TRUNCATE TABLE ' + db_user + '.' + table + ' DROP STORAGE;"'
sqlplus_connection = db_user + '/' + db_password + '@' + db_name
os.system(sqlplus_delete_table + ' | sqlplus -s ' + sqlplus_connection)
Everything works fine, but the problem is the password. As far as I know, SQL*Plus does not use jceks files. So what are other ways to hide the password?

You can use a solution like Django's SECRET_KEY, which I store in a file that is not in the project repository. From this file I load the keys in settings.py like this:
with open(os.path.join(ROOT_DIR, 'etc/secret_key.txt')) as f:
    SECRET_KEY = f.read().strip()
In the above example the contents of the text file is just the key, but you can use a structured format such as JSON or YAML, or even a Python file that you import.
Example of Python secret file:
# secret.py
DB_PSSWD='pswd'
DB_USER='user'
In your source code, simply:
import secret

print(secret.DB_USER)
Example of YAML secret file:
# secret.yaml
db_psswd: pswd
db_user: user
In your source code, simply:
import yaml

with open('secret.yaml') as yaml_secret:
    rules = yaml.safe_load(yaml_secret)
print(rules['db_user'])

On Linux it's possible to create a bash script like:
# sql.env
export db_PSSWD='pswd'
export db_USER='user'
Before running Python, source the script to initialize the environment variables:
source sql.env
Then, in Python:
import os

db_psswd = os.environ.get("db_PSSWD")
db_user = os.environ.get("db_USER")
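Building on the environment-variable approach, here is a minimal sketch of how the original truncate command could avoid exposing the password at all: with sqlplus -s /nolog, the CONNECT statement is fed in on stdin, so the password appears neither in shell history nor in the process list. The variable and table names below are placeholders from the question, not a tested setup:

```python
import os
import subprocess

def build_truncate_script(user, password, db, table):
    """Build a SQL*Plus script that connects from stdin, so the
    password never appears on any command line."""
    return (
        "CONNECT {0}/{1}@{2}\n"
        "TRUNCATE TABLE {0}.{3} DROP STORAGE;\n"
        "EXIT\n"
    ).format(user, password, db, table)

def truncate_table(table, db_name="DB_NAME"):
    # Credentials come from the environment set up by sql.env.
    user = os.environ["db_USER"]
    password = os.environ["db_PSSWD"]
    script = build_truncate_script(user, password, db_name, table)
    # /nolog starts sqlplus without a connect string on the command
    # line; the CONNECT arrives via stdin instead.
    subprocess.run(["sqlplus", "-s", "/nolog"],
                   input=script, text=True, check=True)
```

Compared to os.system with an echo pipeline, this also avoids shell quoting issues entirely.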

Related

Reason why secure-file-priv does not recognize files in its assigned folder?

I am trying to load a .csv file through mysql.connector in python. Here is the relevant portion of the script:
import os
import mysql.connector

print(os.listdir(r'C:\ProgramData\MySQL\MySQL Server 8.0\Uploads'))  # returns test.csv
import_file = 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\test.csv'
cnx = mysql.connector.connect(user=usr,
                              password=passwd,
                              host='127.0.0.1',
                              database='ctr')
cursor = cnx.cursor()
load_data = (
    "LOAD DATA "
    "INFILE "
    "'{}' "
    "INTO TABLE import "
    "FIELDS TERMINATED BY ',' "
    """OPTIONALLY ENCLOSED BY '"' """
    "LINES TERMINATED BY '\\n' "
    "IGNORE 1 LINES".format(import_file))
securefile_query = "SHOW VARIABLES LIKE 'secure_file_priv'"
cursor.execute(securefile_query)
directory = cursor.fetchall()
print(directory)  # returns C:\ProgramData\MySQL\MySQL Server 8.0\Uploads
print(import_file)
print(load_data)
cursor.execute(load_data)
Traceback of the error that follows points to cursor.execute(load_data) in my script, and the last line of the traceback is "mysql.connector.errors.DatabaseError: 1290 (HY000): The MySQL server is running with the --secure-file-priv option so it cannot execute this statement".
According to the secure_file_priv documentation:
If empty, the variable has no effect. This is not a secure setting.
If set to the name of a directory, the server limits import and export operations to work only with files in that directory. The directory must exist; the server does not create it.
If set to NULL, the server disables import and export operations.
I have confirmed multiple times that the file is in the correct directory. I have also confirmed the value of secure_file_priv using "SHOW VARIABLES". Doesn't this mean MySQL ought to be able to read test.csv from the Uploads folder? How can I read the file without disabling secure_file_priv?
Edit 17Jan2022:
Since I'm running the server locally and have the files locally, I ended up using LOAD DATA LOCAL INFILE as a workaround, passing allow_local_infile=True to mysql.connector.connect() and moving the source file to the same directory as the script. Setting secure-file-priv to an empty string (in C:\ProgramData\MySQL\MySQL Server 8.0\my.ini) also worked.
I still don't know why I was receiving "server is running with the --secure-file-priv option" given the previous location of my files.
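One possible explanation, offered as an assumption rather than a confirmed diagnosis: backslashes are escape characters inside MySQL string literals, so the \t in ...Uploads\test.csv can be interpreted as a tab, leaving the server to compare a mangled path against secure_file_priv. MySQL accepts forward slashes in file paths even on Windows, so normalizing the path sidesteps the issue. A sketch (load_data_statement is a hypothetical helper):

```python
def load_data_statement(path, table):
    """Build a LOAD DATA INFILE statement with forward slashes, so
    Windows backslashes are never re-escaped by the MySQL parser."""
    safe_path = path.replace("\\", "/")
    return (
        "LOAD DATA INFILE '{}' "
        "INTO TABLE {} "
        "FIELDS TERMINATED BY ',' "
        "OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n' "
        "IGNORE 1 LINES"
    ).format(safe_path, table)
```

The statement can then be passed to cursor.execute() exactly as before.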

My Python file connected through the right path with my sqlite database but couldn't identify a table in that database

I want to connect my SQLite database "data.sqlite" (the blue highlighted one in the picture below) to my views.py in the core folder (picture below) with:
rPath='../../Upchanges/data.sqlite'
conn = _sqlite3.connect(rPath, check_same_thread=False)
c = conn.cursor()
Path of the database I want to connect with:
Upchanges_desperate/Upchanges/data.sqlite
Path of the Python file I want to connect with the database:
Upchanges_desperate/Upchanges/core/views.py
Picture of my file orientation:
So this is my problem:
I later use these codes below in my python file:
from Upchanges.models import BlogPost  # I set the __tablename__ to 'blog_post' in another Python file

@core.route('/', methods=['GET', 'POST'])
def index():
    search = Blogsearch_form(request.form)
    if request.method == 'POST':
        c.execute("SELECT * FROM blog_post WHERE problem_name LIKE(?)", ('%' + str(search) + '%',))
        results = c.fetchall()
        print(results)
        return render_template('blog_search_result.html', results=results)
    page = request.args.get('page', 1, type=int)
    many_posts = BlogPost.query.order_by(BlogPost.date.desc()).paginate(page=page, per_page=10)
    return render_template('index.html', many_posts=many_posts, form=search)
When I input something and click enter in the Blogsearch_form, my website shows this error: "sqlite3.OperationalError: no such table: blog_post". I also tried replacing blog_post with BlogPost but it shows the same error: "sqlite3.OperationalError: no such table: BlogPost"
I promise you I have a blog_post table in my data.sqlite database. I also use it in the code below, and my website queries the data and works just fine (it shows everything later in the HTML file):
page = request.args.get('page',1,type=int)
many_posts = BlogPost.query.order_by(BlogPost.date.desc()).paginate(page=page, per_page=10)
return render_template('index.html', many_posts=many_posts, form=search)
Additionally, for clarity, the code below (in the __init__.py of my app package) is what creates the data.sqlite database:
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + os.path.join(basedir, '../Upchanges/data.sqlite')
So I really think that something is wrong with the path in the _sqlite3.connect command. I would greatly appreciate it if you could help me fix this problem.
(Please use the picture when you think about the path as it is easier to imagine)
Thank you!!
P.s/ I tried this from my previous question: https://stackoverflow.com/a/61849867/13097721 but I still couldn't fix this problem.
In fact, you do not necessarily know which directory the application was started from.
You could test this with:
import os
print("My current working directory is", os.getcwd())
However, if your database is located relative to a Python file, it is better to determine the path relative to that file's directory than relative to the current working directory.
You can do this, for example, with:
import os
MYDIR = os.path.dirname(__file__)
SQLPATH = os.path.join(MYDIR, "..", "..", "data.sqlite")
I also suggest adding the following line to your existing or modified code.
It will show you exactly where you are pointing:
print("THE REAL SQLPATH is", os.path.realpath(SQLPATH))
Comment after your feedback:
I will add more prints (and remove one ".."); the two extra prints should show you what you need to do.
import os
print("MY FILE = ", os.path.realpath(__file__))
MYDIR = os.path.dirname(__file__)
print("MYDIR = ", os.path.realpath(MYDIR))
SQLPATH = os.path.join(MYDIR, "..", "data.sqlite")
print("This gives me SQLPATH = ", os.path.realpath(SQLPATH))
Comment after next feedback:
So it seems now that the database file is properly located.
I suggest inspecting your database.
From a shell window, type:
sqlite3 /Users/kienletrung/Desktop/Upchanges_desperate/Upchanges/data.sqlite
Then type commands like:
.tables to list all tables
.schema <tablename> to look at a table's format
select * from tablename; to look at all rows of a table
Alternatively, just type:
echo .dump | sqlite3 /Users/kienletrung/Desktop/Upchanges_desperate/Upchanges/data.sqlite > fulldump.txt
to create a full ASCII dump of your database.
If you don't have the sqlite3 command-line client installed, just install it. It should simplify debugging.
You can of course use any other tool to inspect your database and compare the tables/structures your Python code expects with what the database really contains.

How can I use environment variables in a .ini file in Pyramid?

For example, now I have this .ini file:
[DEFAULT]
db_user = user_test
db_password = psw_test
db_host = localhost
db_name = db_test
...
I want to do:
export ENV_DB_USER=user_test
export ENV_DB_PSW=psw_test
export ENV_DB_HOST=localhost
export ENV_DB_NAME=db_test
Then I want to do somethings like this in .ini file:
[DEFAULT]
db_user = ${ENV_DB_USER}
db_password = ${ENV_DB_PSW}
db_host = ${ENV_DB_HOST}
db_name = ${ENV_DB_NAME}
...
I have tried the %()s syntax, but unfortunately it is not working.
Where am I going wrong?
Thanks :)
Unfortunately they are not supported right now - you have to load them at runtime instead of in the ini file itself. A common approach in my apps is to define a render-config script which takes a site.ini.in template and renders it out to site.ini using jinja2 with env vars available to it. It's not very satisfying but it does work well.
import jinja2
import os

def multiline(val):
    lines = (l.strip() for l in val.strip().split('\n'))
    return '\n  ' + '\n  '.join(lines)

def main(cli, args):
    env = jinja2.Environment(
        loader=jinja2.FileSystemLoader(os.getcwd()),
        undefined=jinja2.StrictUndefined,
    )
    env.filters['multiline'] = multiline
    template = env.get_template(args.template)
    result = template.render({
        'env': os.environ,
    })
    with open(args.config_file, 'w', encoding='utf8') as fp:
        fp.write(result)

Is there a way to remove all the special characters from a mysql result set using Python and paramiko?

The script I'm using to gather data is listed here. It works, but the output file has way too many junk characters for me to be able to work with it correctly.
import paramiko
import os

savefile = 'dump.sql'
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('123.34.4.4', username='jlefler', password='paSSw0rd')

def sqlcheck(cmd):
    out = []
    stdin, stdout, stderr = ssh.exec_command(cmd)
    for item in (stdin, stdout, stderr):
        try:
            for line in item:
                out.append(line.strip('\n'))
        except:
            pass
    return list(out)

dump = sqlcheck('mysql -h database -u user -ppassword database -e "select user_id from users limit 50"')
file = open(savefile, 'w')
file.write(str(dump))
file.close()
print 'The dump had ' + str(len(dump)) + ' lines and was saved to ' + str(os.path.realpath('dump.sql'))
My output files looks like this:
[u'user_id', u'\\name1VO', u' name2', u' name3', u'Warning: Using a password on the command line interface can be insecure.']
Is there any way I can utilize paramiko and Python to just download the normal file without any of the odd formatting? I'd like to be able to utilize this for multiple queries, and mysqldumps - but I can't do that if I can't retain the formatting.
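For what it's worth, much of the noise in the output above looks like the Python 2 repr() of a list of unicode strings, with the stderr password warning mixed in. A small sketch (format_rows and split_streams are hypothetical helpers; split_streams assumes your existing paramiko SSHClient) of keeping the streams apart and writing plain text:

```python
def format_rows(stdout_lines):
    """Join query rows into plain text; writing str(list_of_lines)
    to the file is what produces the [u'...', u'...'] noise."""
    return "\n".join(line.rstrip("\n") for line in stdout_lines)

def split_streams(ssh, cmd):
    """Run cmd over an existing paramiko SSHClient, keeping stdout
    (the data) separate from stderr (the password warning)."""
    stdin, stdout, stderr = ssh.exec_command(cmd)
    return stdout.readlines(), stderr.readlines()
```

Writing format_rows(rows) to the file then yields the tab-separated rows exactly as mysql emitted them, which also keeps mysqldump output intact.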

File not being released

OK, so I have to run some queries in Access 2007 and then compact and repair the database. I am using Python and win32com to do this. The code I'm currently using is this:
import os;
import win32com.client;
DB1 = 'db1.mdb'
DB2 = 'db1N.mdb'
DB3 = 'db2.mdb'
DBR = r'db1.mdb'
access = win32com.client.Dispatch("Access.Application")
access.OpenCurrentDatabase(DBR)
DB = access.CurrentDb()
access.DoCmd.OpenQuery("1")
access.DoCmd.OpenQuery("2")
access.DoCmd.OpenQuery("3")
access.CloseCurrentDatabase()
access.Application.Quit();
os.system('copy "db1.mdb" "db2.mdb"')
access = win32com.client.Dispatch("Access.Application")
access.CompactRepair(DB3,DB2)
access.Application.Quit();
os.remove("db2.mdb")
os.remove("db1.mdb")
os.rename("db1N.mdb","db1.mdb")
The problem is that I get this error.
WindowsError: [Error 32] The process cannot access the file because it is being used by another process: 'db1.mdb'
I don't know why I am getting this error, seeing as I am quitting Access, which should close the file. Any idea how to fix this would be greatly appreciated.
Your code includes this line:
DB = access.CurrentDb()
When that line executes, CurrentDb() references db1.mdb. Then you later get the WindowsError at:
os.remove("db1.mdb")
So I'm wondering if the variable DB still holds a reference to db1.mdb. Perhaps you could first try del DB before os.remove().
Or, since your code isn't actually using that DB variable, just get rid of it.
I made some changes to get your example to work for me. To solve your problem, add DB.Close():
import os
import win32com.client

path = 'C:/project/714239'
os.chdir(path)
DB1 = 'db1.mdb'
DB2 = 'db1N.mdb'
DB3 = 'db2.mdb'
access = win32com.client.Dispatch("Access.Application")
access.OpenCurrentDatabase(path + '/' + DB1, False)
DB = access.CurrentDb()
DB.Close()  # ADDED THIS
access.CloseCurrentDatabase()
access.Application.Quit()
os.system('copy ' + DB1 + ' ' + DB2)
access = win32com.client.Dispatch("Access.Application")
access.CompactRepair(path + '/' + DB2, path + '/' + DB3, True)
access.Application.Quit()
os.remove(DB2)
os.remove(DB1)
os.rename(DB3, DB1)
