Execute MySQL script from Arduino Yun (without Python) - python

I want to retrieve data from a MySQL database which is running directly on the Arduino Yun itself.
I want to do it without Python, but directly using MySQL commands and Process. Is it possible?
Every post I found on the internet is about "how to retrieve data using Python". But I don't want to use Python, because the connect-to-database routine and similar overhead slow down my queries.
I'm retrieving data from the database multiple times, and from different tables, so I moved the data-retrieval logic into a MySQL function and now want to just call this function using the Process class. The problem is that it works directly from the mysql console, and works using Python, but doesn't work directly. When I say directly, I mean this:
Process process;
process.begin("python");
process.addParameter("/mnt/sda1/arduino/python/read.py");
process.addParameter(msg);
process.run();
This code works perfectly and uses Python; inside the read.py file I have the database-call routine. So I want to do the same thing, but without Python:
Process process;
process.begin("mysql");
process.addParameter("-u root -parduino --database arduino -e \"select process('my_text')\"");
process.run();
As you can see, this code sample doesn't work. But the same MySQL statement works perfectly, and much faster than through Python, if I run it from the console.

Try Process.runShellCommand. It will accept your complete statement as an argument; unlike addParameter, which passes the whole option string to mysql as one single argument, runShellCommand lets a shell parse it:
Process process;
process.runShellCommand("mysql -u root -parduino --database arduino -e \"select process('my_text')\"");

You can use the subprocess module.
https://docs.python.org/2/library/subprocess.html
Using this you can send your mysql commands to the terminal from a Python script, as if you had typed them directly.
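Not from the original answer; a minimal sketch (Python 2, matching the linked docs) that runs the question's statement and captures its output:
import subprocess

# Run the same mysql one-liner from the question as a child process
# and capture whatever it prints on stdout.
output = subprocess.check_output(
    ["mysql", "-u", "root", "-parduino", "--database", "arduino",
     "-e", "select process('my_text')"])
print output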

Related

executing long running hive queries from remote machine

I have to execute long-running (~10 hour) Hive queries from my local server using a Python script. My target Hive server is in an AWS cluster.
I've tried to execute them using
pyhs2, execute('<command>')
and
paramiko, exec_command('hive -e "<command>"')
In both cases the query runs on the Hive server and completes successfully. But the issue is that even after the query completes, my parent Python script keeps waiting for a return value and remains in interruptible sleep (Sl) state indefinitely!
Is there any way I can make my script work using pyhs2 or paramiko? Or is there any other, better option available in Python?
As I mentioned before, I faced a similar issue in my performance-testing environment.
My use case was running queries through the pyhs2 module with the Hive Tez execution engine. Tez generates a lot of logs (basically on a seconds scale); the logs get captured in the stdout variable and are only delivered as output once the query completes successfully.
The way to overcome this is to stream the output as it is generated, as shown below:
for line in iter(lambda: stdout.readline(2048), ""):
    print line
But for this you will have to use a native connection to the cluster using paramiko or Fabric, and then issue the hive command via the CLI or beeline.
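Not from the original answer; a minimal sketch of the paramiko variant, with a hypothetical host and credentials:
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("hive-gateway.example.com", username="user", password="secret")  # hypothetical

stdin, stdout, stderr = client.exec_command('hive -e "<command>"')
# Drain stdout as it arrives instead of waiting for EOF, so the
# script is not left sleeping behind Tez's voluminous logging.
for line in iter(lambda: stdout.readline(2048), ""):
    print line
client.close()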

How to run a single process multiple times in parallel using a UNIX shell script?

I am working on one of the MPP databases and would like to run a single SQL query in multiple parallel sessions using Python or a UNIX shell script. Can somebody share your thoughts on spawning SQL sessions from a Python/UNIX utility? Any input would be appreciated. Thank you.
Code:
for i in $(seq 1 "$n")   # {1..$n} does not work: brace expansion happens before $n is expanded
do
    ( sh run_sql.sh test.sql touchstone_test & )
done
For Python, you could download the MySQLdb module. MySQLdb is an interface for connecting to a MySQL database server from Python. It implements the Python Database API v2.0 and is built on top of the MySQL C API. More info.
Depending on scale, you can choose either option.
If you have small tasks to accomplish, running a shell script is fine. Note that you can also pipe the query to the mysql CLI client, e.g.
mysql_cmd="mysql -h<host> -u<user> -p<pwd> <db>"
echo "SELECT name, id FROM myobjects WHERE ...." | $mysql_cmd
For a larger-scale project, I would go with Python and the MySQLdb interface that was mentioned already.
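Not from the original answers; a minimal sketch (Python 2) combining the two suggestions, opening one MySQLdb session per worker process via multiprocessing; the placeholders match the shell example above:
import multiprocessing
import MySQLdb

def run_query(i):
    # Each worker opens its own session, since a connection
    # cannot be shared across processes.
    conn = MySQLdb.connect(host="<host>", user="<user>", passwd="<pwd>", db="<db>")
    cur = conn.cursor()
    cur.execute("SELECT name, id FROM myobjects WHERE ....")
    rows = cur.fetchall()
    conn.close()
    return len(rows)

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=4)  # 4 parallel sessions
    print pool.map(run_query, range(4))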

Automating jobs on remote servers with python2.7

I need to make a python script that will do these steps in order, but I'm not sure how to go about setting this up.
SSH into a server
Copy a folder from point A to point B (cp /foo/bar/folder1 /foo/folder2)
mysql -u root -pfoobar (This database is accessible from localhost only)
create a database, do some other mysql stuff in the mysql console
Replaces instances of Foo with Bar in file foobar
Copy and edit a file
Restart a service
The fact that I have to SSH into a server, and THEN do all of this, is really confusing me. I looked into the Fabric library, but it seems to only do one command at a time and doesn't keep context from previous commands.
"I looked into the Fabric library, but that seems to only do one command at a time and doesn't keep context from previous commands."
Look into Fabric more. It is still probably what you want.
This page has a lot of good examples.
By "context" I'm assuming you want to be able to cd into another directory and run commands from there. That's what fabric.context_managers.cd is for -- search for it on that page.
Sounds like you are doing some sort of remote deployment/configuration. There's a whole world of tools out there to set this up professionally; look into Chef and Puppet.
Alternatively, if you're just looking for a quick and easy way of scripting some remote commands, maybe pexpect can do what you need.
Pexpect is a pure Python module for spawning child applications; controlling them; and responding to expected patterns in their output.
I haven't used it myself but a quick glance at its manual suggests it can work with an SSH session fine: https://pexpect.readthedocs.org/en/latest/api/pxssh.html
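Not from the original answer; a minimal pxssh sketch (hypothetical host and credentials) that runs two of the question's steps over an interactive SSH session:
from pexpect import pxssh

s = pxssh.pxssh()
s.login("server.example.com", "user", "secret")      # hypothetical credentials
s.sendline("cp -r /foo/bar/folder1 /foo/folder2")
s.prompt()                                           # wait for the shell prompt to return
s.sendline("mysql -u root -pfoobar -e 'CREATE DATABASE mydb'")  # mydb is hypothetical
s.prompt()
print s.before                                       # output of the last command
s.logout()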
I have never used Fabric.
My way to solve those kinds of issues (before I started using SaltStack) was using pexpect to run the SSH connection and all the commands that were needed.
Maybe using a series of SQL scripts to work with the database (just to make it easier) would help.
Another way, since you need to access the remote server using SSH, would be to use paramiko to connect and execute commands remotely. It's a bit more complicated when you want to see what's happening on stdout (whereas with pexpect you see exactly what's going on).
But it all depends on what you really need.

How to Use Python to execute initial commands over the terminal and then continue use

I have looked into using pxssh, subprocess, and paramiko, but have found no success. What I am ultimately trying to do is figure out a way to not only use SSH to access a server and execute commands from a Python script, but also have it open an instance of the terminal for continued use after executing all the commands.
Currently the server has modules that clients have to manually activate using commands after they have established an SSH connection.
For example:
module python
This command would give the user access to python.
Following this the user would then be able to use python and all its commands through the ssh connection in the terminal.
The issue I have with the methods listed earlier is that they do not display an instance of the terminal. They successfully execute the commands, but since these commands have to be executed every time a new SSH connection is established, that's worthless unless I can get essentially a copy of the terminal in which the Python script executed the commands and loaded up all the modules.
Does anyone have a solution to this? I've scoured the web for hours with no success.
This is a very difficult issue to explain, so if anything is unclear please ask me and I will try my best to rephrase it. I am very new to all this.
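Not from the original question; a sketch of one possible approach using pexpect's interact(), which hands the live terminal over to the user after the scripted setup (the host, prompt pattern, and key-based auth are assumptions):
import pexpect

child = pexpect.spawn("ssh user@server.example.com")  # assumes key-based auth, no password prompt
child.expect(r"\$ ")              # wait for the remote shell prompt (pattern is an assumption)
child.sendline("module python")   # run the initial setup command from the question
child.expect(r"\$ ")
child.interact()                  # give the user the live terminal session, modules loaded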

Getting access to VM's console with Python

I am writing a Python script that creates and runs several VMs via virsh for the user. Some of the configuration has to be done by executing commands inside the VM which I would want to do automatically.
What would be the easiest way to get remote shell access in Python? I am considering the following approaches:
To use the virsh console command as a sub-process and do I/O to it.
To bring up an SSH session to the VM. I can configure the VM before it boots by editing its file system so I know its target IP address.
Is there a better API for doing this? RPC?
I need to get the return values of commands so I know whether they executed correctly, and for that matter I need to be able to detect when a program I invoke has finished. Options #1 and #2 rely on scraping the output, and that gets complex.
Any suggestions much appreciated.
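Not from the original question; a minimal sketch of option #2 with paramiko, which sidesteps output scraping by reading the command's exit status directly (the IP, credentials, and command are assumptions):
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("192.0.2.10", username="root", password="secret")  # the VM's preconfigured IP

stdin, stdout, stderr = client.exec_command("some-setup-command")  # hypothetical command
status = stdout.channel.recv_exit_status()  # blocks until the program finishes
print "exit status:", status                # 0 means the command succeeded
client.close()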
