How to create SQL Server on Azure using the Python SDK?

I want to create a SQL Server and a SQL Database on Azure using the Python SDK.
Does Azure provide support for that?

Azure supports two ways of creating an Azure SQL Database:
Using Azure Service Management (ASM) to create a classic SQL Database (REST API).
Using Azure Resource Manager (ARM) to create a SQL Database (REST API).
For the process of creating an Azure SQL Database, you can refer to the documentation for the C# SDK at https://azure.microsoft.com/en-us/documentation/articles/sql-database-client-library/.
To create a SQL Database using the Python SDK, the API documentation suggests that only ASM mode is supported; see the azure.servicemanagement.sqldatabasemanagementservice module at http://azure-sdk-for-python.readthedocs.org/en/latest/ref/azure.servicemanagement.sqldatabasemanagementservice.html?highlight=sql%20database.
If you want to create a SQL Database via ARM mode, you can try the REST API "Create or Update Database", which needs to be authenticated using the Python SDK for Resource Management.
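A minimal sketch of calling that REST API with the requests package follows; it assumes you have already obtained a bearer token for https://management.azure.com/ (for example, via a service principal), and the subscription, resource group, server, and database names are all placeholders.
import requests

subscription_id = "my-subscription-id"   # placeholder
resource_group = "my-resource-group"     # placeholder
server_name = "myserver"                 # placeholder
database_name = "mydb"                   # placeholder
arm_token = "..."                        # ARM bearer token obtained separately

# ARM "Create or Update Database" endpoint (classic 2014-04-01 api-version)
url = (
    "https://management.azure.com/subscriptions/{}/resourceGroups/{}"
    "/providers/Microsoft.Sql/servers/{}/databases/{}?api-version=2014-04-01"
).format(subscription_id, resource_group, server_name, database_name)

body = {"location": "West US", "properties": {"edition": "Basic"}}
resp = requests.put(url, json=body,
                    headers={"Authorization": "Bearer " + arm_token})
print(resp.status_code, resp.json())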

You can check the Python SDK for Azure: the azure.servicemanagement.sqldatabasemanagementservice module provides functions to create a new SQL Server and a new SQL Database; please check the details in this article. So the answer to your question is yes. Hope this helps.
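For illustration, here is a minimal sketch using that module; the subscription id, management certificate path, admin credentials, and service objective GUID are placeholders you must replace with your own values.
from azure.servicemanagement import SqlDatabaseManagementService

# Authenticate with your subscription id and management certificate
sms = SqlDatabaseManagementService(
    subscription_id="my-subscription-id",        # placeholder
    cert_file="path/to/management_cert.pem",     # placeholder
)

# Create a new logical SQL Server; the generated server name is returned
server_name = sms.create_server(
    admin_login="sqladmin",
    admin_password="StrongPassword!1",
    location="West US",
)

# Create a database on that server; service_objective_id selects the
# performance tier (the GUID below is a placeholder)
sms.create_database(
    server_name,
    name="mydb",
    service_objective_id="00000000-0000-0000-0000-000000000000",
    edition="Web",
)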

Related

Connect to Azure SQL in Python using a Service Principal

I have a Service Principal with a client id and a client secret. It has permission to the Azure SQL DB. I want to use Python to generate an access token and use it to authenticate to my SQL server. Could someone guide me?
I am new to Python and would appreciate it if someone could specify whether I need any supporting libraries for this to work.
I want to use python to generate an access token.
The Azure Active Directory Authentication Library (ADAL) for Python can be used to access SQL Server in Azure.
Refer to this GitHub repository for more details on how to implement it.
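A minimal sketch of that flow with the adal and pyodbc packages follows; the tenant id, client id, client secret, server, and database names are placeholders, and it assumes the Microsoft ODBC Driver for SQL Server is installed.
import struct
import adal
import pyodbc

# Acquire a token for Azure SQL with the service principal's credentials
authority = "https://login.microsoftonline.com/my-tenant-id"  # placeholder tenant
context = adal.AuthenticationContext(authority)
token = context.acquire_token_with_client_credentials(
    "https://database.windows.net/",   # Azure SQL resource URI
    "my-client-id",                    # placeholder
    "my-client-secret",                # placeholder
)

# pyodbc expects the token as a length-prefixed UTF-16-LE byte string
access_token = token["accessToken"].encode("utf-16-le")
token_struct = struct.pack("<i", len(access_token)) + access_token
SQL_COPT_SS_ACCESS_TOKEN = 1256  # connection attribute defined by the driver

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver.database.windows.net;Database=mydb;",  # placeholders
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)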

Python packages or design patterns that help with loading data to multiple cloud databases

I have a dictionary of DataFrames, each representing a table in a database. I want to load all of these tables into multiple cloud databases, including:
AWS DynamoDB
Snowflake (ODBC)
AWS RedShift
AWS DocumentDB
Azure CosmosDB
GCP Spanner
GCP BigQuery
GCP FireStore
GCP MemoryTable
AWS ElastiCache
AWS Neptune
AWS QLDB
Is there any design pattern I can employ to deal with this problem? What about Python packages that can connect to those databases? Is there any similar code on GitHub?
Or is it easier to first load the data into one database, such as MongoDB, and then migrate it to the other databases?
It sounds like you want to use an ORM framework such as SQLAlchemy for Python, or an ETL tool such as Kettle (which is JDBC-based), to transfer the data from DataFrames to multiple cloud databases.
However, there is no single solution that supports all of these different databases, as the figures below show.
Fig 1. The databases supported by SQLAlchemy (screenshot from the SQLAlchemy Dialects page)
Fig 2. The databases supported by the classic ETL tool Kettle (screenshot from Kettle's Special database issues and experiences page)
So a mixed solution is:
1. Manually write a Python script using SQLAlchemy (a sketch follows this list).
2. Use Kettle directly with the necessary plugins that support additional databases (such as inquidia/PentahoSnowflakePlugin for Snowflake), or even integrate it with Python via its CPython Script Executor.
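A minimal sketch of option 1, assuming pandas plus SQLAlchemy dialects for the SQL-speaking targets (Snowflake, Redshift, and BigQuery have third-party dialects; NoSQL targets such as DynamoDB or Firestore need their own client libraries instead); the connection URLs below are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection URLs; replace with real hosts and credentials
TARGETS = {
    "snowflake": "snowflake://user:password@account/db/schema",
    "redshift": "redshift+psycopg2://user:password@host:5439/db",
    "bigquery": "bigquery://my-project/my_dataset",
}

def load_all(frames, targets):
    """Write every DataFrame in `frames` to every SQL-speaking target."""
    for target_name, url in targets.items():
        engine = create_engine(url)
        for table_name, df in frames.items():
            # if_exists="replace" drops and recreates the table on each run
            df.to_sql(table_name, engine, if_exists="replace", index=False)
        engine.dispose()

# Example usage with a dict of DataFrames like the one in the question
load_all({"customers": pd.DataFrame({"id": [1, 2]})}, TARGETS)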
Hope it helps.

How to access Google Cloud SQL by Python in Google Compute Engine

I am new to programming and Google services. I have read this document and set up Cloud SQL access by IP address.
However, for access from Python, the documentation only covers App Engine and using the Proxy. How can I access it from Python on Google Compute Engine? Thanks
Credit: @Mangu
Depending upon the application and the requirements, you can connect to Cloud SQL from Python on Compute Engine using either the external IP of the Cloud SQL instance or the Cloud SQL Proxy (the Cloud SQL Proxy is available only for Second Generation instances).
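For the external IP route, a minimal sketch with the PyMySQL package looks like this; the IP, user, password, and database name are placeholders, and it assumes the VM's address has been authorized on the Cloud SQL instance.
import pymysql

conn = pymysql.connect(
    host="203.0.113.10",    # external IP of the Cloud SQL instance (placeholder)
    user="dbuser",          # placeholder
    password="dbpassword",  # placeholder
    db="mydatabase",        # placeholder
)
try:
    with conn.cursor() as cursor:
        cursor.execute("SELECT NOW()")
        print(cursor.fetchone())
finally:
    conn.close()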
Here is another complete example: the tutorial on using Cloud SQL with the Python Bookshelf application.

Read/Write Mysql DB using Python API

Is there any Python API that will allow me to interact with a remote MySQL DB (running on RDS) in a transactional manner (read, write)? I believe this can be done for DynamoDB using Boto, but I couldn't find anything similar for MySQL.
Any suggestions?
Thanks,
Bucho
There is no such thing in Boto.
The easiest and simplest library for interacting with MySQL from Python is, in my opinion, MySQLdb.
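A minimal transactional sketch with MySQLdb follows; the RDS endpoint, credentials, and table are placeholders.
import MySQLdb

conn = MySQLdb.connect(
    host="mydb.xxxxxxxx.us-east-1.rds.amazonaws.com",  # placeholder endpoint
    user="admin",          # placeholder
    passwd="secret",       # placeholder
    db="mydatabase",       # placeholder
)
try:
    cur = conn.cursor()
    # Two writes that must succeed or fail together
    cur.execute("UPDATE accounts SET balance = balance - %s WHERE id = %s", (100, 1))
    cur.execute("UPDATE accounts SET balance = balance + %s WHERE id = %s", (100, 2))
    conn.commit()      # make both writes durable
except MySQLdb.Error:
    conn.rollback()    # undo the partial transaction on failure
    raise
finally:
    conn.close()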

How can I use Google App Engine with MS-SQL?

I use:
Python 2.7
the pyodbc module
Google App Engine 1.7.1
I can use pyodbc with Python, but Google App Engine can't load the module; I get a "No module named pyodbc" error.
How can I fix this error, or how can I use an MS-SQL database with my local Google App Engine?
Google App Engine does not support access to your own SQL server, and does not support loading C-API libraries of your own.
You can use Google Cloud SQL storage, which is implemented with MySQL databases and which you can access via its Cloud SQL API for Python.
Note that the rdbms module Google provides implements the PEP 249 Python Database API 2.0 specification, just like the pyodbc module, so you should have no problems using it:
from google.appengine.api import rdbms

# Connect to the Cloud SQL instance (PEP 249 style, like pyodbc)
conn = rdbms.connect(instance=INSTANCE_NAME, database=DATABASE)
cursor = conn.cursor()
cursor.execute('SELECT * FROM sometable')
rows = cursor.fetchall()
conn.close()
You could, at least in theory, replicate your data from MS-SQL to the Google Cloud SQL database. It is possible to create triggers in the MS-SQL database so that every transaction is reflected in your App Engine application via a REST API you would have to build.
