I have a complex SQL script that updates several tables, and a separate Python script that creates reports. I use MS SQL Server for the SQL and VSCode for the Python, so I have to switch applications to run each file independently.
I'm not sure if this is possible, but can you run SQL code and Python code in separate code chunks within the same file in VSCode? I don't mean executing the SQL script from Python (that doesn't work because the query has dozens of temp tables and multiple outputs), but running them independently in the same file, so I only run one file for my outputs.
All ideas are welcome. Thanks.
Related
I need to run a Python script when an Excel file gets dropped into OneDrive for Business. The Python script needs to read from that file and then do some actions. I have been searching around and I see that an Azure Functions app is the way to go. I wanted to know if there are any other options to get this done. I see that Power Automate Desktop only runs Python 2 code, but I need to use Python 3. Any way around that? Could I call the Python script with the variable (the file in OneDrive) using PowerShell?
Here's what I want to automate:
get data from an API,
do some cleaning and wrangling in python,
save as Excel,
load to PowerBI and
produce very simple dashboard.
I've been searching for ages on Microsoft forums and here to find out if there's a way to run PowerBI via a script, a PowerShell script for instance. I'm comfortable with writing a script for the first 3 steps, but I have not been able to find a solution for steps 4-5. It has to be loaded into PowerBI because that's what clients are most familiar with. The ideal result would be a script that I can package as an executable file and send to a non-technical person so they can run it at their leisure.
Although I'd prefer a solution in Python, if it's possible in SQL or R I'd also be very happy. I've tried all the in-PowerBI options for using Python scripts, but I have found them limited and difficult to use. I've packaged all my ETL functions into a .py script, then imported it into PowerBI and ran the functions from there, but it's still not really fully automated and wouldn't land well with non-technical folks. Thanks in advance! (I am working on a PC with PowerBI Desktop.)
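For context, here is a minimal sketch of what my steps 1-3 look like in Python (the API URL and column handling are placeholders, the API is assumed to return a JSON array of records, and df.to_excel needs the openpyxl package for .xlsx output):
import requests
import pandas as pd

# Step 1: get data from an API (placeholder URL)
response = requests.get("https://api.example.com/data", timeout=30)
response.raise_for_status()

# Step 2: basic cleaning and wrangling with pandas
df = pd.DataFrame(response.json())
df = df.dropna().drop_duplicates()

# Step 3: save as Excel for PowerBI to pick up
df.to_excel("report_data.xlsx", index=False)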
I've made a Python application that relies on a MySQL database. Is there any way I could save the MySQL database into a file format and distribute it along with the code to my teacher? He needs to be able to view the code and run it, so I am not able to export it to a .exe file. I'm currently using MySQL Workbench to run my database on a Mac, whereas my teacher uses a PC with Microsoft Access.
Basically, I want a folder that contains my program saved as a .py file and the MySQL file alongside it, so my teacher can simply open my code and run it, and the program fully works with the database fully integrated.
I think you have 2 options.
Option A would be to export a so-called MySQL dump, send it along with your files, and use Python to import it into the MySQL database. This option requires that your teacher has MySQL installed.
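A minimal sketch of Option A, assuming the MySQL command-line client is on your teacher's PATH (the database name, user and file name are placeholders):
# Dump created on your machine with something like:
#   mysqldump -u root -p school_db > school_db_dump.sql
# The school_db database must already exist on the teacher's machine.
import subprocess

with open("school_db_dump.sql", "rb") as dump:
    # Pipe the dump into the mysql client; -p makes it prompt for the password.
    subprocess.run(["mysql", "-u", "root", "-p", "school_db"], stdin=dump, check=True)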
Option B would be to not use a database application at all and work with a CSV file instead. With this option you can just send the CSV file with your code. It obviously requires changing your Python code so it uses the CSV file instead of the database.
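A minimal sketch of Option B, using only the standard library (students.csv and its columns are placeholders):
import csv

# Read all records into memory (replaces a SELECT)
with open("students.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# ... work with rows as plain dictionaries instead of query results ...

# Write the records back out (replaces INSERT/UPDATE)
with open("students.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)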
From my point of view (I may be missing something), you should not ship a database installer with your files that installs a DB on your teacher's computer.
I have been trying to run Python from SSIS, so I needed to create a package in SQL Server. I can run small scripts in SQL Server, but I am not sure how to run a whole script file.
The snippet below works, but my Python code is in test_db.py. How do I run that Python script in SQL Server?
EXEC sp_execute_external_script
    @language = N'Python',
    @script = N'print(3+4)'
STDOUT message(s) from external script:
7
If the Python engine is installed on the server where you are trying to run this script, you can use an Execute Process Task and call python.exe. Pass the .py file as an argument to the task and that will run the script as well.
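As a rough sketch, the Execute Process Task could be configured along these lines (both paths are placeholders for wherever Python and your script live on the server):
Executable:        C:\Python39\python.exe
Arguments:         C:\scripts\test_db.py
WorkingDirectory:  C:\scripts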
There are two approaches to executing Python scripts from SSIS:
(1) Executing Python Script using Execute Process Task
You can execute the Python script from an Execute Process Task, have it write its results to a flat file, and then read from the flat file into SQL Server (a minimal sketch follows the link below); you can refer to the following link for more information:
Scraping data with SSIS and Python
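As a sketch of that pattern, the Python script that the Execute Process Task runs could write its results to a flat file which a later Data Flow Task then loads with a Flat File Source (the path, columns and example rows are placeholders):
import csv

# Placeholder results produced by the Python script
rows = [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"},
]

# Write the results where the SSIS Flat File Source expects them
with open(r"C:\ssis\staging\python_output.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name"])
    writer.writeheader()
    writer.writerows(rows)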
(2) Using IronPython
IronPython is an open-source implementation of the Python programming language which is tightly integrated with the .NET Framework. IronPython can use the .NET Framework and Python libraries, and other .NET languages can use Python code just as easily.
You can use a Script Component to integrate the IronPython library:
I haven't used this library before, so I am not sure how much I can help, but I have read a comment written by @billinkc linking to the answer below, which contains a great guide on how to do that:
SSIS: Execute Ironpython or Ironruby scripts through SSIS
References
Predicting data with Python script in an SSIS package
How to create a SSIS package to ETL JSON from Python REST API request into MSSQL server?
I want to implement a simple application (for Windows), mainly for data collection, and I should be able to run some queries on it.
I can produce an exe file from the Python code, but I don't want to install the database separately; rather, I'd include it in the installer. I think some other applications do the same, or they require MySQL to be installed (not in development mode), since I saw in Windows' Programs and Features that there are some MySQL installations.
So my question is: Is there a way to use MySQL, or some specific way of preparing the application, so that I can perform database operations, including saving data? Remark: any database would work; I was referring to MySQL only because I think I saw some examples using it.
You could use PyInstaller to make an exe out of your Python file and ship the database file with it. Python natively supports SQLite (the built-in sqlite3 module), so you don't have to install anything extra if you use that.
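A minimal sketch of the SQLite route, where the database is just a file that ships next to the exe (data.db and the table are placeholders):
import sqlite3

conn = sqlite3.connect("data.db")  # creates the file on first run
conn.execute(
    "CREATE TABLE IF NOT EXISTS measurements (id INTEGER PRIMARY KEY, value REAL)"
)
conn.execute("INSERT INTO measurements (value) VALUES (?)", (42.0,))
conn.commit()

for row in conn.execute("SELECT id, value FROM measurements"):
    print(row)

conn.close()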