I have a GCP workspace, complete with a PostgreSQL database. I frequently need to insert and/or select rows from the db. I've been searching for a Python script that will (A) connect to GCP, then (B) connect to the db, then (C) query a specific table. I'd prefer not to hard-code my credentials if possible, so that I could share this script with others on my team and, provided they were authorized users, it would run without any hiccups.
Does anyone have such a script?
I believe I just answered your question here: Access GCP Cloud SQL from AI notebook?
Using the Cloud SQL Python Connector, which was mentioned in the other post, you can run a script that looks something like this to connect to your database and run a query:
# Copyright 2021 Google LLC.
# SPDX-License-Identifier: Apache-2.0
import os

from google.cloud.sql.connector import Connector

# Initialize the connector and connect to the database
connector = Connector()
conn = connector.connect(
    os.getenv("INSTANCE_CONNECTION_NAME"),  # "project:region:instance"
    "pg8000",  # the PostgreSQL driver to use
    user=os.getenv("DB_USER"),
    password=os.getenv("DB_PASSWORD"),
    db=os.getenv("DB_NAME"),
)

# Execute a query
cursor = conn.cursor()
cursor.execute("SELECT * FROM my_table")

# Fetch the results
result = cursor.fetchall()

# Do something with the results
for row in result:
    print(row)
The instance connection name should be in the format project:region:instance. If you don't want to hard code database credentials, you can read them in from environment variables instead.
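Since you also need inserts: a minimal sketch of a parameterized INSERT over the same connection (the table and column names here are hypothetical):
# Hypothetical table/columns; pg8000 uses %s-style ("format") placeholders
cursor.execute(
    "INSERT INTO my_table (col1, col2) VALUES (%s, %s)",
    ("some value", 42),
)
conn.commit()  # pg8000 connections do not autocommit by default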
I want to import data from a DynamoDB table into SQL Server.
I'm using Python and boto3.
Basically, you need to use pymssql:
A simple database interface for Python that builds on top of FreeTDS
to provide a Python DB-API (PEP-249) interface to Microsoft SQL
Server.
You create a connection:
import pymssql

conn = pymssql.connect(server, user, password, "tempdb")
cursor = conn.cursor(as_dict=True)
Then you can use execute or executemany to build an INSERT statement.
If you are working with a large amount of data, it is better to save it to a CSV file and then use a BULK INSERT statement, as that will be faster.
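Putting the two halves together, here is a minimal sketch under stated assumptions: a DynamoDB table source_table with id and name attributes, a matching SQL Server table target_table, and the server/user/password variables from the connection example above.
import boto3
import pymssql

# Scan the DynamoDB table (fine for small tables; paginate for large ones)
dynamodb = boto3.resource("dynamodb")
items = dynamodb.Table("source_table").scan()["Items"]

# executemany runs the parameterized INSERT once per row
conn = pymssql.connect(server, user, password, "tempdb")
cursor = conn.cursor()
cursor.executemany(
    "INSERT INTO target_table (id, name) VALUES (%s, %s)",
    [(item["id"], item["name"]) for item in items],
)
conn.commit()
conn.close()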
I created an Azure Function with Python and want to write some data into an Azure SQL DB.
If I run the code on my local machine via the Azure Functions debugger, everything works. But when I deploy everything to Azure, I only get a message that there is an error (with no additional specific information).
I think this is related to the ODBC Driver?
I'm using the following code to connect and insert data:
import logging
import pyodbc

conn_str = ("DRIVER=" + driver + ";SERVER=tcp:" + server + ";PORT=" + port
            + ";DATABASE=" + database + ";UID=" + username + ";PWD=" + password
            + ";Authentication=ActiveDirectoryPassword")
with pyodbc.connect(conn_str, timeout=120) as conn:
    with conn.cursor() as cursor:
        try:
            cursor.execute(data)
        except pyodbc.Error as e:
            logging.error("Can't execute SQL query: %s", e)  # log the real error
I use driver = '{ODBC Driver 17 for SQL Server}' as the driver.
I assume this driver is missing in Azure? How can this issue be fixed? What is the right approach to connect from Azure Functions to an Azure SQL DB via Python?
It seems the ODBC driver is included; it was just poorly documented:
https://github.com/MicrosoftDocs/azure-docs/issues/54423
There is an example project here:
https://github.com/kevin808/azure-function-pyodbc-MI
The full tutorial including creating the system assigned identity can be found here:
https://techcommunity.microsoft.com/t5/apps-on-azure-blog/how-to-connect-azure-sql-database-from-python-function-app-using/ba-p/3035595
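For reference, a minimal sketch of what the managed-identity connection can look like with pyodbc (the server and database names are hypothetical, and the system-assigned identity must already have been granted access to the database, as the tutorial describes):
import pyodbc

# Hypothetical server/database; Authentication=ActiveDirectoryMsi tells
# ODBC Driver 17 to authenticate with the system-assigned managed identity
conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:myserver.database.windows.net,1433;"
    "DATABASE=mydb;"
    "Authentication=ActiveDirectoryMsi"
)
with pyodbc.connect(conn_str, timeout=120) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchone())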
There is currently a SQL extension under development, but it only supports C# at the moment. Python support has been requested as an enhancement, so you can add your 👍 to the issue below if you'd like to use bindings:
https://github.com/Azure/azure-functions-sql-extension/issues/172
I'm trying to connect to an Oracle database from a Python script, but I'm not allowed to use any third-party imports/downloads, only the Python standard library. The only solution I've found is cx_Oracle, which is off-limits for that reason. I'm not super familiar with Oracle databases; could someone explain how to connect and query without using cx_Oracle and things like it?
Sourced from the documentation:
https://cx-oracle.readthedocs.io/en/latest/installation.html#quick-start-cx-oracle-installation
Example:
import cx_Oracle

# Connect as user "hr" with password "welcome"
# to the "orclpdb" service running on this computer.
connection = cx_Oracle.connect("hr", "welcome", "localhost/orclpdb")
cursor = connection.cursor()
cursor.execute("""
    SELECT first_name, last_name
    FROM employees
    WHERE department_id = :did AND employee_id > :eid""",
    did=50,
    eid=190)
for fname, lname in cursor:
    print("Values:", fname, lname)
Oracle's network protocol isn't public, so you need either (i) some Oracle technology installed on your computer that speaks that protocol, which is what cx_Oracle and Oracle Instant Client provide, or (ii) something like Oracle's ORDS product running against the database, which lets you use REST calls instead.
If you need to interact with an Oracle database, you can make a very strong argument that you need to install cx_Oracle and Oracle Instant Client. cx_Oracle is on PyPI, so it can be installed like any other Python package. Instant Client needs to be installed separately, but it is the Oracle product you should expect to need in order to connect to an Oracle DB.
I want to load a database from a remote server to my memory/cache, so that I don't have to make network calls every time I want to use the database.
I am doing this in Python, and the database is Cassandra. How should I do it? I have heard about memcached and Beaker. Which library is best for this purpose?
If you are trying to get some data from a database, use the pyodbc module. This module can be used to download data from a given table in a database. Answers can also be found here.
An example how to connect:
import pyodbc

cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=SQLSRV01;'
                      'DATABASE=DATABASE;UID=USER;PWD=PASSWORD')
cursor = cnxn.cursor()
cursor.execute("SQL_QUERY")
for row in cursor.fetchall():
    print(row)
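If you need to filter rows, pyodbc supports "?" placeholders; a small sketch (the table and column names here are hypothetical):
# Hypothetical table and column; "?" is pyodbc's parameter placeholder
cursor.execute("SELECT * FROM my_table WHERE id = ?", 42)
print(cursor.fetchone())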
I want to insert data into a Cloud SQL MySQL database from a local Python application. Is this possible, and if so, how?
I have tried running the examples at the bottom of https://cloud.google.com/appengine/docs/python/cloud-sql/#Python_complete_python_example
import MySQLdb

db = MySQLdb.connect(unix_socket='/cloudsql/PROJECT-ID:INSTANCE-NAME', user='phil')
cursor = db.cursor()
cursor.execute('SHOW VARIABLES')
for r in cursor.fetchall():
    print(r)
db.close()
However, I get the error:
`Can't connect to local MySQL server through socket`
Of course it's possible, but that's the documentation for using it specifically from App Engine. Instead, you should use the docs for connecting from an external application: you'll need to configure access, then point MySQLdb at the instance's IP address rather than a local socket.
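A minimal sketch of the external-connection version (the host IP, password, and database name are hypothetical; your machine's address must be in the instance's authorized networks):
import MySQLdb

# Hypothetical values; host is the Cloud SQL instance's public IP
db = MySQLdb.connect(host="203.0.113.10", user="phil",
                     passwd="my-password", db="my-database")
cursor = db.cursor()
cursor.execute("SHOW VARIABLES")
for r in cursor.fetchall():
    print(r)
db.close()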