PostgreSQL ALTER SEQUENCE using psycopg2 in django - python

I am trying to alter the sequence of the id field in one of my tables using psycopg2. This works fine on my local server, but not on production. I am not getting any exceptions; the sequence just does not restart.
def alter_sequence(last_id):
    try:
        dbname = settings.DATABASES['default']['NAME']
        user = settings.DATABASES['default']['USER']
        host = settings.DATABASES['default']['HOST']
        password = settings.DATABASES['default']['PASSWORD']
        port = settings.DATABASES['default']['PORT']
        connection = psycopg2.connect(
            dbname=dbname,
            user=user,
            password=password,
            host=host,
            port=port,
        )
        cursor = connection.cursor()
        cursor.execute('ALTER SEQUENCE "gs_requests_id_seq" RESTART WITH {}'.format(last_id))
        connection.close()
    except Exception as e:
        print(e)
        pass
I double-checked the database settings and they are correct. Other database operations performed by the Django ORM with these settings work fine.
I realize this may not be enough information about my project setup, but I don't know what else to specify. I have PostgreSQL 9.6 on my local computer and 10.1 on production.

Try this, it worked for me:
cursor.execute('ALTER SEQUENCE gs_requests_id_seq RESTART WITH {};'.format(last_id))

I found the source of the problem: I didn't commit the transaction:
connection.commit()
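Putting the fix together, here is a minimal sketch of the corrected function. The settings lookup and sequence name are taken from the question; the int() cast and the try/finally are extra safety measures, not part of the original code:
import psycopg2
from django.conf import settings

def alter_sequence(last_id):
    db = settings.DATABASES['default']
    connection = psycopg2.connect(
        dbname=db['NAME'],
        user=db['USER'],
        password=db['PASSWORD'],
        host=db['HOST'],
        port=db['PORT'],
    )
    try:
        cursor = connection.cursor()
        # int() ensures only a number is interpolated into the statement.
        cursor.execute('ALTER SEQUENCE "gs_requests_id_seq" RESTART WITH {}'.format(int(last_id)))
        # Without the commit, the change is discarded when the connection closes.
        connection.commit()
    finally:
        connection.close()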

Related

FASTAPI testing database not creating database

I'm trying to test my FastAPI app. It seems to me that all the settings are correct.
test_users.py
engine = create_engine(
    f"postgresql"
    f"://{settings.database_username}"
    f":{settings.database_password}"
    f"@{settings.database_hostname}"
    f":{settings.database_port}"
    f"/test_{settings.database_name}"
)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base.metadata.create_all(bind=engine)
def override_get_db():
    try:
        db = TestingSessionLocal()
        yield db
    finally:
        db.close()
app.dependency_overrides[get_db] = override_get_db
client = TestClient(app)
def test_create_user():
    response = client.post(
        "/users/",
        json={"email": "nikita@gmail.com", "password": "password"}
    )
    new_user = schemas.UserOutput(**response.json())
    assert response.status_code == 201
    assert new_user.email == "nikita@gmail.com"
When I run pytest, I get this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: FATAL: database "test_social_media_api" does not exist
Why is the code not creating the database?
With engine = create_engine("postgresql://...") you define a connection to an existing PostgreSQL database.
And with Base.metadata.create_all(bind=engine) you create the tables - according to your models - in that existing database.
So the code that you have written does not create a database; it expects that you give it an already existing database.
And that has to do with PostgreSQL itself.
PostgreSQL runs as a server, and a PostgreSQL server can run multiple databases. Each database has to be created explicitly.
Just telling SQLAlchemy the connection string is not enough.
It's possible to create a new database from Python itself by connecting to the PostgreSQL server (see https://www.tutorialspoint.com/python_data_access/python_postgresql_create_database.htm), or alternatively you can create it manually before you run your script, e.g. by running CREATE DATABASE databasename; inside psql (or any other database tool).
However, if you want to test against a running database, I would suggest using testcontainers. They will spawn a new PostgreSQL server with an empty database every time you run the tests.
Notice that the example from the FastAPI documentation works differently.
They just use
SQLALCHEMY_DATABASE_URL = "sqlite:///./test.db"
engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
Base.metadata.create_all(bind=engine)
which creates the database.
This works because SQLite doesn't run as a server. It's just one file that represents the full database, and if the file doesn't exist, the SQLite database adapter will assume that the database is simply empty and create a new file for you. PostgreSQL doesn't work like this, though.
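If you want the tests to create the PostgreSQL database themselves, a minimal sketch with psycopg2 could look like the following. This is an assumption, not part of the original answer: it reuses the settings object from the question, connects to the default postgres maintenance database first, and will raise an error if the test database already exists:
import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

def create_test_database():
    # Connect to the server's default maintenance database.
    conn = psycopg2.connect(
        dbname="postgres",
        user=settings.database_username,
        password=settings.database_password,
        host=settings.database_hostname,
        port=settings.database_port,
    )
    # CREATE DATABASE cannot run inside a transaction, so switch to autocommit.
    conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
    cur = conn.cursor()
    cur.execute(f"CREATE DATABASE test_{settings.database_name}")
    cur.close()
    conn.close()

# Call this once before Base.metadata.create_all(bind=engine).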

How to safely store MySQL database credentials in my code

try:
    connection = mysql.connector.connect(host='localhost', database='USER', user='root', password='password')
    sql_select_Query = "select * from AuthSys WHERE mac = '%s'" % mac
    cursor = connection.cursor()
    cursor.execute(sql_select_Query)
    row_headers = [x[0] for x in cursor.description]
    records = cursor.fetchall()
except mysql.connector.Error as e:
    return [e]
finally:
    if connection.is_connected():
        cursor.close()
        connection.close()
        print("MySQL connection is closed")
I want to store host='localhost', database='USER', user='root', password='password' securely in my Python project, so that whoever uses my script will not get access to my database.
Note: I am new to Stack Overflow. If I wrote something wrong, please point me in the right direction. Thanks in advance.
You should probably put the credentials in a separate config file that isn't deployed with the project, and pass the path of this file to the main entry point of the application, something like this:
python main.py --config=/your-path/to/your-config-file.ini
You will also need to parse this --config argument and then read and parse the your-config-file.ini file.
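As an illustration, a minimal sketch of that approach could look like this (the section name and keys in the .ini file are made up for the example):
import argparse
import configparser

import mysql.connector

# Example your-config-file.ini contents (hypothetical):
# [database]
# host = localhost
# database = USER
# user = root
# password = password

parser = argparse.ArgumentParser()
parser.add_argument("--config", required=True, help="path to the credentials .ini file")
args = parser.parse_args()

config = configparser.ConfigParser()
config.read(args.config)
db = config["database"]

connection = mysql.connector.connect(
    host=db["host"],
    database=db["database"],
    user=db["user"],
    password=db["password"],
)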
If you don't have too many such settings, one common option is to get them from system environment variables:
user = os.environ["myuser"]
password = os.environ["mypassword"]
connection = mysql.connector.connect(host='localhost', database='USER', user=user, password=password)
See https://12factor.net/ factor 3.
I'd prefix all of the app's environment variable names with something common, giving e.g. bkapp_user and bkapp_password.

Having a problem with the connection between a python app and postgres db running in docker

I am having trouble trying to make queries to a postgres db that is running in Docker.
As far as I can tell the database is connecting correctly using the psycopg2 library.
However when I execute a command and try to fetch the results, the results are empty. I can make the same query at the command line using psql and I get the expected results.
Can anyone help me figure out what I am doing wrong?
con = psycopg2.connect(
    host='localhost',
    port='5432',
    database='daystarr',
    user='postgres',
    password='admin',
)
print('Connected to Postgres Database')
cur = con.cursor()
cur.execute('SELECT * FROM tickets')
rows = cur.fetchall()
for r in rows:
    print(r)
print(cur.fetchone())
cur.close()
con.close()
When I run the script, cur.fetchall() returns nothing and cur.fetchone() returns None.
The other thing is that when I try to execute the command CREATE TABLE tickets (ticket_id INT PRIMARY KEY); I get a duplicate table error, so it must be connecting to a database that already has the table. Thanks in advance!
Printing cur.fetchall() directly also just shows an empty result.
Okay, so this is a weird edge case, but if you are looking at this in the future, here is what the problem was. I had a local installation of PostgreSQL that was also listening on port 5432, and my Docker container was exposing the same port. My app must have been connecting to the local install instead of the container. I uninstalled PostgreSQL from my local machine and the app found the Docker container immediately. Thank you to anyone who helped.
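If you want to keep the local PostgreSQL installation alongside the container, an alternative (not part of the original answer) is to map the container to a different host port and connect to that instead:
# Assumes the container is started with the port remapped, e.g.:
#   docker run -p 5433:5432 ... postgres
# so host port 5433 forwards to the container's 5432.
con = psycopg2.connect(
    host='localhost',
    port='5433',  # remapped host port, avoids the locally installed server on 5432
    database='daystarr',
    user='postgres',
    password='admin',
)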
You are missing the ; in your query, it should be
cur.execute("SELECT * FROM tickets;")

Connecting to MySQL DB using mysql.connector.connect fails with no error to catch

I'm using Python to try to connect to a DB. This code worked, and then something in my environment changed so that the host is not present/accessible. This is as expected. The thing that I'm trying to work out is that I can't seem to catch the error when this happens. This is my code:
def create_db_connection(self):
    try:
        message('try...')
        DB_HOST = os.environ['DB_HOST']
        DB_USERNAME = os.environ['DB_USERNAME']
        DB_PASSWORD = os.environ['DB_PASSWORD']
        message('connecting...')
        db = mysql.connector.connect(
            host=DB_HOST,
            user=DB_USERNAME,
            password=DB_PASSWORD,
            auth_plugin='mysql_native_password'
        )
        message('connected...')
        return db
    except mysql.connector.Error as err:
        log.info('bad stuff happened...')
        log.info("Something went wrong: {}".format(err))
        message('exception connecting...')
    except Exception as ex:
        log.info('something bad happened')
        message("Exception: {}".format(ex))
    message('returning false connection...')
    return False
I see up to the message('connecting...') call, but nothing afterwards. Also, I don't see any of the except messages/logs at all.
Is there something else I need to catch/check in order to know that a DB connection attempt has failed?
This is running inside an AWS Lambda and was working until I changed some subnets, etc. The key thing is that I want to catch it when it can no longer connect.
The issue is most likely that your Lambda function is timing out before the database connection times out.
First, modify the Lambda function's timeout to 60 seconds and test. You should find that after about 30 seconds the connection to the database times out.
To resolve this issue, modify the security group on the database instance to allow inbound traffic from the security group configured for the Lambda, opening the correct port (3306).
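Separately, you can make the failure surface inside the Lambda's time limit by giving the connector an explicit timeout. This is a rough sketch, not from the original answer; connection_timeout is the mysql-connector-python option (in seconds) and 10 is an arbitrary value:
import mysql.connector

try:
    db = mysql.connector.connect(
        host=DB_HOST,
        user=DB_USERNAME,
        password=DB_PASSWORD,
        auth_plugin='mysql_native_password',
        connection_timeout=10,  # fail after ~10s instead of hanging until the Lambda times out
    )
except mysql.connector.Error as err:
    log.info("Something went wrong: {}".format(err))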

Connecting psycopg2 with Python in Heroku

I've been trying for some days to connect my Python 3 script to a PostgreSQL database (psycopg2) on Heroku, without Django.
I found some articles and related questions, but I had to invest a lot of time to get something that I thought should be very straightforward, even for a newbie like me.
I eventually made it work somehow, but hopefully posting the question (and answer) will help other people achieve it faster.
Of course, if anybody has a better way, please share it.
As I said, I had a Python script that I wanted to run from the cloud using Heroku. No Django involved (just a script/scraper).
Articles that I found helpful at the beginning, even if they were not enough:
Running Python Background Jobs with Heroku
Simple twitter-bot with Python, Tweepy and Heroku
Main steps:
1. Procfile
Procfile has to be:
worker: python3 folder/subfolder/myscript.py
2. Heroku add-on
The Heroku Postgres :: Database add-on has to be added to the appropriate personal app in the Heroku account.
To make sure this was properly set up, this was quite helpful.
3. Python script with db connection
Finally, to create the connection in my python script myscript.py, I took this article as a reference and adapted it to Python 3:
import psycopg2
import urllib.parse as urlparse
import os
url = urlparse.urlparse(os.environ['DATABASE_URL'])
dbname = url.path[1:]
user = url.username
password = url.password
host = url.hostname
port = url.port
con = psycopg2.connect(
    dbname=dbname,
    user=user,
    password=password,
    host=host,
    port=port
)
To create a new database, this SO question explains it. The key line is:
con.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
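A very rough sketch of how that line fits with the con connection created above (the database name here is just a placeholder):
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

# CREATE DATABASE cannot run inside a transaction block, hence autocommit.
con.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
cur = con.cursor()
cur.execute("CREATE DATABASE my_new_db")  # placeholder name
cur.close()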
You can do it using the SQLAlchemy library.
First, you need to install SQLAlchemy using pip (if you don't have pip installed, a quick Google search will show you how):
pip install sqlalchemy
Here is a code snippet that does what you want:
from sqlalchemy import create_engine, text
from sqlalchemy.orm import scoped_session, sessionmaker
import os

# Put your URL in an environment variable and connect.
engine = create_engine(os.getenv("DATABASE_URL"))
db = scoped_session(sessionmaker(bind=engine))

# Some variables you need.
var1 = 12
var2 = "Einstein"

# Execute statements
rows = db.execute(
    text("SELECT id, username FROM users WHERE id=:id AND username=:username"),
    {"id": var1, "username": var2},
).fetchall()

# Don't forget to commit if you did an insertion, etc...
db.commit()
I wasn't able to parse the DATABASE_URL provided by Heroku with the urllib.parse as suggested above, but the following worked for me:
The URL I retrieved from Heroku was in the format:
postgres://username:password@host:port/database
for example:
postgres://jticiuimwernbk:ff78903549d4c6ec13a53a8ffefcd201b937d54c35d976@ec2-52-123-182-987.compute-1.amazonaws.com:5432/dbsd4fdf6c1awq
So I manually dissected it as follows:
user = 'jticiuimwernbk'
password = 'ff78903549d4c6ec13a53a8ffefcd201b937d54c35d976'
host = 'ec2-52-123-182-987.compute-1.amazonaws.com'
port = '5432'
database = 'dbsd4fdf6c1awq'
# Then created the connection using the above:
con = psycopg2.connect(database=database,
                       user=user,
                       password=password,
                       host=host,
                       port=port)
# and now I was able to perform queries:
cur = con.cursor()
cur.execute("<some SQL query>;")  # cursor.execute() returns None, so fetch separately
results = cur.fetchall()
cur.close()
con.close()
