I'm struggling to get data from an older Postgres database into a newer one.
I have already tried a few things without success. For more information, see Using ANSI driver to connect to a postgreSQL DB with python psycopg2.
My latest idea is to use pg_dump and pg_restore.
from subprocess import PIPE, Popen

# destination handling has to be adjusted.
def dump_table(host_name, database_name, user_name, database_password, table_name):
    # dump the single table in pg_dump's custom format to /tmp/table.dmp
    command = 'pg_dump -h {0} -d {1} -U {2} -p 5432 -t public.{3} -Fc -f /tmp/table.dmp'.format(
        host_name, database_name, user_name, table_name)
    p = Popen(command, shell=True, stdin=PIPE, universal_newlines=True)
    # send the password followed by a newline on stdin
    return p.communicate('{}\n'.format(database_password))
After some research, I came across the above approach. Unfortunately, I haven't really worked with the shell yet. As I currently understand it, the Postgres database is addressed locally, so the table.dmp file is stored on that host in the /tmp/ directory.
However, I do not have direct access to that directory to download files from it. Is there a way to receive the file created here directly in Python, so I can process it directly on the target server? The target server is the one I have access to.
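What I imagine would work is something like the sketch below, which keeps the dump in memory instead of writing it to /tmp (it assumes pg_dump reads the password from the PGPASSWORD environment variable and writes the dump to stdout when -f is omitted):

import os
from subprocess import PIPE, Popen

def dump_table_to_bytes(host_name, database_name, user_name,
                        database_password, table_name):
    # no -f option: pg_dump writes the custom-format dump to stdout
    command = ['pg_dump', '-h', host_name, '-d', database_name,
               '-U', user_name, '-p', '5432',
               '-t', 'public.{}'.format(table_name), '-Fc']
    # pass the password via the environment instead of stdin
    env = dict(os.environ, PGPASSWORD=database_password)
    p = Popen(command, stdout=PIPE, env=env)
    dump, _ = p.communicate()
    return dump  # raw dump bytes, ready to feed to pg_restore on the target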
I need to execute the following command from Python on Windows:
psql -h localhost -p 5432 -U postgres -f script.sql db_name
The above command works fine when run from Git Bash / PowerShell. After entering the command in a terminal, I need to provide a password to confirm it (similar to when using sudo).
How can I do that from Python? I keep finding solutions that appear to be Linux-based.
How do I do it on Windows? I have tried many variations involving subprocess, e.g.:
import subprocess

p2 = subprocess.Popen(
    'psql -h localhost -p 5432 -U postgres -f script.sql db_name',
    stdin=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True)
print('this will print')
sudo_prompt = p2.communicate('THE_PASSWORD' + '\n')[1]
print('this will not')
A better (more secure) option than invoking psql with an explicit password is to use a .pgpass file as described in the docs (and keep it protected, e.g. chmod 600 ~/.pgpass). This keeps your password out of the list of running processes.
On Windows:
On Microsoft Windows the file is named %APPDATA%\postgresql\pgpass.conf (where %APPDATA% refers to the Application Data subdirectory in the user's profile).
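With the password file in place, the Python side no longer needs to feed anything on stdin. A minimal sketch (it assumes psql is on PATH and that pgpass.conf contains a matching line such as localhost:5432:db_name:postgres:THE_PASSWORD):

import subprocess

result = subprocess.run(
    ['psql', '-h', 'localhost', '-p', '5432', '-U', 'postgres',
     '-w',  # --no-password: never prompt; fail instead if the lookup misses
     '-f', 'script.sql', 'db_name'],
    capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    print(result.stderr)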
I created a Cassandra database in DataStax Astra. I'm able to connect to it in Python (using the cassandra-driver module and the secure_connect_bundle). I wrote a few APIs in my Python application to query the database.
I read that I can upload a CSV to it using dsbulk. I am able to run the following command in a terminal, and it works:
dsbulk load -url data.csv -k foo_keyspace -t foo_table \
-b "secure-connect-afterpay.zip" -u username -p password -header true
Then I try to run this same line in Python using subprocess:
ret = subprocess.run(
    ['dsbulk', 'load', '-url', 'data.csv', '-k', 'foo_keyspace', '-t', 'foo_table',
     '-b', 'secure-connect-afterpay.zip', '-u', 'username', '-p', 'password',
     '-header', 'true'],
    capture_output=True
)
But I got FileNotFoundError: [Errno 2] No such file or directory: 'dsbulk': 'dsbulk'. Why is dsbulk not recognized if I run it from Python?
A related question: it's probably not best practice to rely on subprocess. Are there better ways to upload batch data to Cassandra?
I think it has to do with the way PATH is handled by subprocess. Try specifying the command as an absolute path, or a relative one like "./dsbulk" or "bin/dsbulk".
Alternatively, if you add the bin directory from the DS Bulk package to your PATH environment variable, it will work as you have it.
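For example, you could resolve the executable explicitly before the call; a sketch (the fallback install path below is an assumption, adjust it to wherever dsbulk actually lives):

import shutil
import subprocess

# look dsbulk up on PATH first, then fall back to an assumed install location
dsbulk = shutil.which('dsbulk') or '/opt/dsbulk/bin/dsbulk'
ret = subprocess.run(
    [dsbulk, 'load', '-url', 'data.csv', '-k', 'foo_keyspace', '-t', 'foo_table',
     '-b', 'secure-connect-afterpay.zip', '-u', 'username', '-p', 'password',
     '-header', 'true'],
    capture_output=True, text=True)
print(ret.stdout or ret.stderr)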
I have two Raspberry Pis. I am trying to transfer files from one Pi to the other using scp. I am trying to do this through Python because the program that will be transferring files is a Python file.
Below is the shell script I have for the scp part (password and IP blurred out):
#!/bin/sh
sshpass -p ######## scp test.txt pi@IP:/home/pi
And below is the Python script that launches that shell script:
import subprocess
subprocess.call(['./ssh.sh'])
print("DONE")
For some reason the Python script doesn't kick back any errors and hits the print line, but the file is not transferred. When I run the scp command outside of Python, the file transfers just fine. Am I doing something incorrect here?
****EDIT****
I can't even get subprocess to work with this, which is why I ended up using a shell script. Here is my attempt with subprocess:
import subprocess
subprocess.call("sshpass -p ######## scp test.txt pi@IP:/home/pi")
print("DONE")
Again I get no errors, but the file is not transferred.
****EDIT #2****
So I found out that because sshpass is being used, scp isn't prompting me to add the IP to known hosts; as a result the file simply isn't transferred at all. I need a way to add this acceptance into the script, i.e. I get the following if I launch the command without sshpass:
The authenticity of host 'IP (IP)' can't be established.
ECDSA key fingerprint is 13:91:24:8e:6f:21:98:1f:5b:3a:c8:42:7a:88:e9:91.
Are you sure you want to continue connecting (yes/no)?
I want to use communicate to pass "yes\n" to this prompt, as well as the password afterwards. Is this possible?
For the first query:
You can use subprocess.Popen to get the output (stdout) and error (stderr) of the executed command.
import subprocess

cmd = 'sshpass -p ****** scp dinesh.txt root@256.219.210.135:/root'
# capture both streams as text so they can be printed directly
p = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE, universal_newlines=True)
out, err = p.communicate()
print("Output is ", out)
print("Error is ", err)
If you execute the above code with a wrong password, you will get the output below:
[root@centos /]# python code.py
Output is
Error is Permission denied, please try again.
In this case, if the file is successfully transferred, there is no output.
If you execute a command like 'ls -l' instead, its output will be printed.
For your second query (****EDIT #2****), the options are:
Passwordless SSH (set up key-based authentication so no prompt appears at all).
Pexpect, which can drive the interactive prompts programmatically, as in the sketch below.
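A minimal Pexpect sketch (it assumes the third-party pexpect package is installed; the host, path, and password are placeholders):

import pexpect

# spawn scp and answer its interactive prompts
child = pexpect.spawn('scp test.txt pi@IP:/home/pi')
i = child.expect(['(yes/no)', '[Pp]assword:'])
if i == 0:
    # first connection: accept the host key, then wait for the password prompt
    child.sendline('yes')
    child.expect('[Pp]assword:')
child.sendline('########')  # the pi account password (placeholder)
child.expect(pexpect.EOF)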
I found a much easier way of tackling all of this:
sshpass -p ###### scp -o StrictHostKeyChecking=no test.txt pi@IP:/home/pi
The -o option lets me automatically store the IP in known_hosts, so I do not need to interact with the prompt at all. The interaction from Python to the shell works with that addition; doing this solely through subprocess also works.
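For the record, the subprocess-only version would look something like this sketch (password and host are placeholders):

import subprocess

# pass the command as a list so no shell or wrapper script is needed
ret = subprocess.call(
    ['sshpass', '-p', '########', 'scp',
     '-o', 'StrictHostKeyChecking=no',
     'test.txt', 'pi@IP:/home/pi'])
print("DONE" if ret == 0 else "scp failed with code {}".format(ret))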
If you don't mind trying other approaches, it's worth using SCPClient from the scp module.
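A short sketch of that route (it assumes the third-party paramiko and scp packages are installed; host, user, and password are placeholders):

from paramiko import AutoAddPolicy, SSHClient
from scp import SCPClient

ssh = SSHClient()
# same effect as StrictHostKeyChecking=no: accept unknown host keys
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect('IP', username='pi', password='########')
with SCPClient(ssh.get_transport()) as scp:
    scp.put('test.txt', '/home/pi')
ssh.close()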
I have written a program in Python which accesses a MySQL database on my computer. My question is: how do I make my program run on other machines, i.e. how do I transfer the database?
Thank you for reading...
Use the tools that come with the MySQL installation, from the command line.
Backup:
mysqldump -u root -ppass21 --databases yourdb > multibackup.sql
Restore:
mysql -u sadmin -ppass21 Customers < multibackup.sql
(Note there is no space after -p; with a space, the client prompts for the password instead of reading it from the argument.)
Backing-up-and-restoring-your-MySQL-Database
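If you want to trigger the backup from the Python program itself, something like this sketch should do it (credentials and names are the placeholders from above; mysqldump must be on PATH):

import subprocess

# stream mysqldump's stdout straight into the backup file
with open('multibackup.sql', 'w') as f:
    subprocess.run(['mysqldump', '-u', 'root', '-ppass21',
                    '--databases', 'yourdb'],
                   stdout=f, check=True)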
I'm running some deployment tasks with Fabric that need to check out/update a Mercurial repository on the machine and then do the appropriate copying/configuration.
Every time I instantiate a new machine (we're currently using EC2 for our infrastructure) or run hg pull on the machine, it asks for my SSH key passphrase, which is a bit annoying when we need to initialize a dozen machines at a time.
I've tried running ssh-add in Fabric when the new EC2 instance is initialized, but it seems ssh-agent isn't running for that shell, and I get a "Could not open a connection to your authentication agent." message in the Fabric output.
How would I make ssh-add work when connected to the instance by the Fabric script?
A comment on fabric's issue tracker solved this for me. It's a modified version of the lincolnloop solution. Using this "run" instead of fabric's will pipe your commands through ssh locally, allowing your local ssh-agent to provide the keys.
from fabric.api import env, roles, local, output
from fabric.operations import _shell_escape

def run(command, shell=True, pty=True):
    """
    Helper function.
    Runs a command with SSH agent forwarding enabled.
    Note:: Fabric (and paramiko) can't forward your SSH agent.
    This helper uses your system's ssh to do so.
    """
    real_command = command
    if shell:
        cwd = env.get('cwd', '')
        if cwd:
            cwd = 'cd %s && ' % _shell_escape(cwd)
        real_command = '%s "%s"' % (env.shell,
                                    _shell_escape(cwd + real_command))
    if output.debug:
        print("[%s] run: %s" % (env.host_string, real_command))
    elif output.running:
        print("[%s] run: %s" % (env.host_string, command))
    local("ssh -A %s '%s'" % (env.host_string, real_command))
Please note that I'm running Fabric 1.3.2, and this fix won't be needed much longer.
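The helper drops in where Fabric's own run would be used. A hypothetical task for illustration:

from fabric.api import task

@task
def update_repo():
    # the command goes through the local ssh client with -A, so the
    # local ssh-agent supplies the key and no passphrase prompt appears
    run('hg pull -u')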