I have an API that I built using Python FastAPI that pulls from a SQLite database using SQLAlchemy. It works fine in dev, but after I deployed it I'm getting:
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) attempt to write a readonly database
The code right now is:
SQLALCHEMY_DATABASE_URL = 'sqlite:///./etfs.db'
engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={'check_same_thread': False})
How do I connect to the database so that it isn't read-only? Or how do I create the database in SQLite so that it isn't read-only to begin with?
Related
I'm trying to test my FastAPI app. As far as I can tell, all the settings are correct.
test_users.py
engine = create_engine(
    f"postgresql"
    f"://{settings.database_username}"
    f":{settings.database_password}"
    f"@{settings.database_hostname}"
    f":{settings.database_port}"
    f"/test_{settings.database_name}"
)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base.metadata.create_all(bind=engine)
def override_get_db():
    try:
        db = TestingSessionLocal()
        yield db
    finally:
        db.close()
app.dependency_overrides[get_db] = override_get_db
client = TestClient(app)
def test_create_user():
    response = client.post(
        "/users/",
        json={"email": "nikita@gmail.com", "password": "password"}
    )
    new_user = schemas.UserOutput(**response.json())
    assert response.status_code == 201
    assert new_user.email == "nikita@gmail.com"
When I run pytest, I get this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "localhost" (::1), port 5432 failed: FATAL: database "test_social_media_api" does not exist
Why is the code not creating the database?
With engine = create_engine("postgresql://...") you define a connection to an existing PostgreSQL database.
And with Base.metadata.create_all(bind=engine) you create the tables - according to your models - in that existing database.
So the code that you have written does not create a database; it expects to be given an already existing database.
And that has to do with PostgreSQL itself.
PostgreSQL runs as a server, and a PostgreSQL server can run multiple databases. And each database has to be created explicitly.
Just telling SQLAlchemy the connection string is not enough.
It's possible to create a new database from Python itself by connecting to the PostgreSQL server (see https://www.tutorialspoint.com/python_data_access/python_postgresql_create_database.htm), or alternatively you can create it manually before you run your script, e.g. by running CREATE DATABASE databasename; inside psql (or any other database tool).
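A minimal sketch of the Python route, assuming psycopg2 and placeholder connection details (swap in your own host, user, and password):
import psycopg2

# Connect to the default "postgres" maintenance database first
conn = psycopg2.connect(
    host="localhost", port=5432,
    user="postgres", password="password",
    dbname="postgres",
)
conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction
with conn.cursor() as cur:
    cur.execute("CREATE DATABASE test_social_media_api")
conn.close()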
However, if you want to test against a running database, I would suggest using testcontainers. It will spawn a new PostgreSQL server with an empty database every time you run the tests.
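A rough sketch of that approach with the testcontainers-python package; the image tag and the pytest fixture wiring are my assumptions, not part of the original setup:
import pytest
from sqlalchemy import create_engine
from testcontainers.postgres import PostgresContainer

@pytest.fixture(scope="session")
def engine():
    # Spins up a throwaway PostgreSQL server in Docker for the test session
    with PostgresContainer("postgres:15") as postgres:
        engine = create_engine(postgres.get_connection_url())
        Base.metadata.create_all(bind=engine)  # Base comes from your models module
        yield engine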
Notice that the example from the FastAPI documentation works differently.
They just use
SQLALCHEMY_DATABASE_URL = "sqlite:///./test.db"
engine = create_engine(
SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
Base.metadata.create_all(bind=engine)
which creates the database.
This works because SQLite doesn't run as a server. It's just one file that represents the full database, and if the file doesn't exist, the SQLite database adapter will assume that the database is just empty and create a new file for you. PostgreSQL doesn't work like this, though.
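You can see this behaviour with nothing but the standard library (the filename here is arbitrary):
import os
import sqlite3

conn = sqlite3.connect("brand_new.db")  # creates the file if it doesn't exist
conn.close()
print(os.path.exists("brand_new.db"))  # True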
I've followed the documentation from start to end.
I'm trying to connect to a Cosmos DB instance so I can write data to it.
My Databricks cluster runtime version is:
11.3 LTS
I have installed the Cosmos DB Spark connector:
com.azure.cosmos.spark:azure-cosmos-spark_3-3_2-12:4.15.0 per the documentation.
I have the following code:
cosmosEndpoint = "MyEndpoint"
cosmosMasterKey = "MyMasterKey"
cosmosDatabaseName = "SampleDB"
cosmosContainerName = "testContainer"
#Configure Catalog Api to be used
spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog") spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", cosmosEndpoint) spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", cosmosMasterKey)
#create an Azure Cosmos DB database using catalog api
spark.sql("CREATE DATABASE IF NOT EXISTS cosmosCatalog.{};".format(cosmosDatabaseName))
#create an Azure Cosmos DB container using catalog api
spark.sql("CREATE TABLE IF NOT EXISTS cosmosCatalog.{}.{} using cosmos.oltp TBLPROPERTIES(partitionKeyPath = '/id', manualThroughput = '1100')".format(cosmosDatabaseName, cosmosContainerName)) `
I get the error AnalysisException: Catalog 'cosmoscatalog' not found. I have followed the documentation from start to end. Does anyone know why this error occurs?
When connecting the Cosmos DB SQL API to Databricks, make sure you install the library on the cluster correctly; most of the time this error occurs because the library is not installed properly.
For the connection, give the appropriate credentials from Cosmos DB.
Code that I tried for connecting the Cosmos DB SQL API to Databricks:
cosmosEndpoint1 = "Endpoints"
cosmosMasterKey1 = "masterkey"
cosmosDatabaseName1 = "demo3DB"
cosmosContainerName1 = "demo3Container"
#Configure Catalog Api to be used
spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", cosmosEndpoint1)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", cosmosMasterKey1)
#create an Azure Cosmos DB database using catalog api
spark.sql("CREATE DATABASE IF NOT EXISTS cosmosCatalog.{};".format(cosmosDatabaseName1))
#create an Azure Cosmos DB container using catalog api
spark.sql("CREATE TABLE IF NOT EXISTS cosmosCatalog.{}.{} using cosmos.oltp TBLPROPERTIES(partitionKeyPath = '/id', manualThroughput = '1100')".format(cosmosDatabaseName1, cosmosContainerName1))
So I am trying to communicate with a Google Cloud SQL Server instance that I have created, from an external Python program written in VS Code, but I don't know where to begin. Any help would be useful.
I'd recommend using the Cloud SQL Python Connector to manage your connections to Cloud SQL. It supports the pytds driver and should help resolve your troubles for connecting to a SQL Server instance from a Python application.
from google.cloud.sql.connector import connector
import sqlalchemy
# configure Cloud SQL Python Connector properties
def getconn() ->:
conn = connector.connect(
"PROJECT:REGION:INSTANCE",
"pytds",
user="YOUR_USER",
password="YOUR_PASSWORD",
db="YOUR_DB"
)
return conn
# create connection pool to re-use connections
pool = sqlalchemy.create_engine(
    "mssql+pytds://localhost",
    creator=getconn,
)
# query or insert into Cloud SQL database
with pool.connect() as db_conn:
    # query database
    result = db_conn.execute("SELECT * from my_table").fetchall()
    # Do something with the results
    for row in result:
        print(row)
For more detailed examples and additional params refer to the README of the repository.
I think you can take inspiration from this: the Python Django tutorial, "Run the app on your local computer".
I am using Flask and flask_restplus to build a REST API in Python.
The directory structure I followed is the same as in the link below, under the heading
Multiple APIs with reusable namespaces
https://flask-restplus.readthedocs.io/en/stable/scaling.html
Now I am trying to add a MySQL database connection, and for that I used the following code in app.py
from flaskext.mysql import MySQL
mysql = MySQL()
app.config['MYSQL_DATABASE_USER'] = 'name'
app.config['MYSQL_DATABASE_PASSWORD'] = 'test'
app.config['MYSQL_DATABASE_DB'] = 'textclassifier'
app.config['MYSQL_DATABASE_HOST'] = 'localhost'
mysql.init_app(app)
I tried using the mysql object through
from app import mysql
which is wrong. I read the whole document, but I could not find a way to add the MySQL connection without changing the structure.
Does anyone have an idea how I can solve this?
I am new to Flask and REST APIs in Python.
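A common way around this circular import (sketched here as a suggestion, reusing the same flaskext.mysql extension and config from the question) is to move the extension object into its own module and bind it to the app afterwards:
# extensions.py (hypothetical module holding shared extension objects)
from flaskext.mysql import MySQL

mysql = MySQL()

# app.py
from extensions import mysql

app.config['MYSQL_DATABASE_USER'] = 'name'
app.config['MYSQL_DATABASE_PASSWORD'] = 'test'
app.config['MYSQL_DATABASE_DB'] = 'textclassifier'
app.config['MYSQL_DATABASE_HOST'] = 'localhost'
mysql.init_app(app)
Any namespace module can then do from extensions import mysql without importing app.py, which is what triggers the circular import.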
I have a database running on my local machine that I can access through Microsoft SQL Server Management Studio. I connect to the server "JIMS-LAPTOP\SQLEXPRESS" and then I can run queries through the manager. However, I need to be able to connect to this database and work with it through Python.
When I try to connect using sqlite3 like
conn = sqlite3.connect("JIMS-LAPTOP\SQLEXPRESS")
I get an "unable to open database file" error.
I tried accessing the temporary file directly like this
conn = sqlite3.connect("C:\Users\Jim Notaro\AppData\Local\Temp\~vs13A7.sql")
c = conn.cursor()
c.execute("SELECT name FROM sqlite_master WHERE type = \"table\"")
print c.fetchall()
This allows me to access a database, but it is completely empty (no tables are displayed).
I also tried connecting like this
conn = sqlite3.connect("SQL SERVER (SQLEXPRESS)")
This is the name shown in SQL Server Configuration Manager, but it also returns a blank database.
I'm not sure how I'm supposed to be connecting to the database using Python.
You can't use sqlite3 to connect to SQL Server, only to SQLite databases.
You need to use a driver that can talk to MS SQL, like pyodbc.
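A minimal sketch of that connection with pyodbc; the driver name and the use of Windows authentication are assumptions, so adjust them to your setup:
import pyodbc

# Connect to the local SQL Server Express instance from the question
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    r"SERVER=JIMS-LAPTOP\SQLEXPRESS;"
    "DATABASE=master;"          # placeholder: use your actual database name
    "Trusted_Connection=yes;"   # Windows authentication
)
cursor = conn.cursor()
cursor.execute("SELECT name FROM sys.tables")  # SQL Server's equivalent of sqlite_master
for row in cursor.fetchall():
    print(row.name)
conn.close()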