I'm working with fabric3 (https://pypi.python.org/pypi/Fabric3), a Python 3 port of Fabric.
I have the following function, which I'm running locally on Windows 7 using Git Bash:
@roles('production')
def dir():
    env.key_filename = '~/.ssh/deploy'
    local("git push mysite master")
    run('pwd')
    run('ls')
    code_dir = '/home/deploy/mysite'
    with cd(code_dir):
        run('pwd')
        run('git reset --hard master')
        run('ls -la')
Output:
$ fab dir
[deploy@198.x.x.x] Executing task 'dir'
[localhost] local: git push mysite master
deploy@198.x.x.x's password:
When I run the function, I get asked for the password. It seems to be ignoring the key. How can I get the function to use the prescribed key?
I added git as a user to my .ssh/config file, and it now appears to work:
Host deploy
    HostName 198.x.x.x
    User deploy
    PreferredAuthentications publickey
    IdentityFile ~/.ssh/deploy
    IdentitiesOnly yes

Host 198.x.x.x
    HostName 198.x.x.x
    User git
    IdentityFile ~/.ssh/deploy
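As a side note, Fabric can also be pointed at those ~/.ssh/config entries directly via env.use_ssh_config; a minimal sketch assuming the Host deploy alias above (the whoami task is purely illustrative):

from fabric.api import env, run, roles

env.use_ssh_config = True            # let Fabric/paramiko read ~/.ssh/config
env.roledefs = {
    'production': ['deploy'],        # matches the "Host deploy" entry above
}

@roles('production')
def whoami():
    # the User and IdentityFile for 'deploy' come from ~/.ssh/config
    run('whoami && pwd')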
Related
I have created a Docker container to run my Python program inside.
My program needs to read the known_hosts file under my .ssh folder:
import os
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, password=password)
I have mounted it into the docker container using:
docker run --name test_cntr --rm \
-v $SCRIPT_DIR:/home/ \
-v $DATA_DIR:/home/data \
-v $HOME/.ssh/known_hosts:/root/.ssh/known_hosts \
-e PYTHONPATH=/home/sciprt_dir:/home/sciprt_dir/lib \
-e INDEX=0 \
dummy_image python /home/run.py
I found that my program can sometimes read the known_hosts file successfully, but sometimes it fails with the error below:
Exception is [Errno -2] Name or service not known
I didn't re-run the container during the run.py execution, so the known_hosts file mounted at startup should have been available to run.py for the whole run.
In the end I found that one of the servers used by this program was not registered on the DNS server, so my program worked when it connected to a registered server and failed when it hit the unregistered one. Thanks all for the help!
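For reference, that [Errno -2] error comes from hostname resolution, not from the mounted known_hosts file; a quick hedged check inside the container makes this visible (the hostname below is hypothetical):

import socket

server = 'db01.example.internal'   # hypothetical hostname; use the one your script connects to
try:
    print(socket.gethostbyname(server))        # succeeds only for hosts the DNS server knows
except socket.gaierror as exc:
    print('DNS lookup failed for %s: %s' % (server, exc))   # -> [Errno -2] Name or service not known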
I'm working on Windows 7 and trying to use Fabric to push changes to an Ubuntu 16.04 VPS. So far I have:
env.roledefs = {
    'test': ['localhost'],
    'dev': ['user@dev.example.com'],
    'production': ['deploy@xxx.xx.xx.xx']
}

@roles('production')
def dir():
    env.key_filename = '~/.ssh/id_rsa'
    local("pip freeze > requirements.txt")
    local("git add . --all && git commit -m 'fab'")
    local("git push myproject master")
    run('pwd')
    ...
When I run this the output is:
$ fab dir
[deploy@xx.xx.xx.xx] Executing task 'dir'
[localhost] local: pip freeze > requirements.txt
[localhost] local: git add . --all && git commit -m 'fab'
warning: LF will be replaced by CRLF in .idea/workspace.xml.
The file will have its original line endings in your working directory.
[master warning: LF will be replaced by CRLF in .idea/workspace.xml.
The file will have its original line endings in your working directory.
256de92] 'fab'
warning: LF will be replaced by CRLF in .idea/workspace.xml.
The file will have its original line endings in your working directory.
3 files changed, 10 insertions(+), 9 deletions(-)
[localhost] local: git push example master
debug1: Connecting to 198.91.88.101 [198.91.88.101] port 22.
debug1: connect to address 198.91.88.101 port 22: Connection refused
ssh: connect to host 198.91.88.101 port 22: Bad file number
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
Fatal error: local() encountered an error (return code 128) while executing 'git push example master'
So Fabric is trying to push to the wrong target IP address (this was an old VPS address; I no longer have it). I got rid of that VPS but saved the public and private keys, and uploaded the public key to my new VPS at a new IP address.
The problem is that I'm not sure where the old target address is being set. Is this a git issue? How do I redirect Fabric to push to @roles('production')?
When I look in my .ssh/known_hosts I see 198.91.88.101, so I'm wondering if that is involved in some way.
It is in the git remote configuration. Verify with git remote --verbose.
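Once the stale URL shows up there, repoint the remote at the new VPS. A sketch using Fabric's local(); the remote name myproject and the repository path are assumptions based on the fabfile above, so substitute your own:

from fabric.api import local

def fix_remote():
    local('git remote --verbose')   # shows the old 198.91.88.101 URL
    # placeholder user/host/path: point the remote at the new VPS instead
    local('git remote set-url myproject deploy@xxx.xx.xx.xx:/home/deploy/myproject.git')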
I have Windows 7.
I am trying:
env.hosts = ['xxx.xx.xx.xxx', 'xxx.xx.xx.xxx', 'xxx.xx.xx.xxx']
env.user = 'root'
# raw string so the Windows backslashes are not treated as escapes
env.key_filename = r'C:\Users\Doniyor\Desktop\ssh\secure-life\privkey.ppk'

def dm():
    app_path = '/var/www/myproj/'
    env_path = '/var/www/virtualenvs/myproj'
    with cd(env_path):
        run('. bin/activate')
    with cd(app_path):
        run('git pull origin master')
        run('python manage.py collectstatic --settings=myproj.settings')
        run('python manage.py migrate --settings=myproj.settings')
        run('touch conf/uwsgi.ini')
But it keeps asking for the root password.
What is missing here? I have been fighting with this for almost two days now.
Add the public key that matches that private key to user root on all those servers with:
ssh-copy-id root@123.45.56.78
Make sure the SSH agent is also running. The SSH agent is what uses your private key for authentication when you try to log in to a remote server over SSH. See "Running SSH Agent when starting Git Bash on Windows".
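A minimal fabfile sketch for this setup, assuming the key was exported to OpenSSH format (Fabric's paramiko backend does not read PuTTY .ppk files) and its public half copied over with ssh-copy-id; the key path and the uptime task are just placeholders:

from fabric.api import env, run

env.hosts = ['xxx.xx.xx.xxx']
env.user = 'root'
# assumption: the .ppk was exported to an OpenSSH-format key, e.g. with PuTTYgen
env.key_filename = r'C:\Users\Doniyor\.ssh\id_rsa'

def uptime():
    # if the key is accepted, this runs without any password prompt
    run('uptime')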
I want to be able to log in to my AWS postgres database from a remote machine. I am using the following Fabric script:
import sys
from fabric.api import env, run, abort

env.port = 123
env.use_ssh_config = True

def setuser(user):
    """Sets the ssh user for the fabric script"""
    env.user = user
    env.password = 'mypassword'

def setenv(server):
    """Sets the environment for the fabric script"""
    env.hosts = ['staging']

def sync():
    # log into AWS server
    run("psql --host=staging.xxx.rds.amazonaws.com --username=x_user --port=5432 --password --dbname=x_database")
    run("mypassword")
I run this Fabric script using the following command:
fab -f sync_staging.py sync --password=mypassword
This logs me into the remote machine, runs the run("psql ... line, and then prompts me for a password:
[stage] out: Password for user x_user:
Is there any way that I can supply the password (or respond to the prompt) such that it logs me in automatically?
There are 2 ways of solving this that I know of:
a .pgpass password file in your home directory on the remote host
the PGPASSWORD environment variable (set on the remote host)
If you need to set an environment variable on the remote host, use with shell_env(PGPASSWORD='mypassword'); Fabric docs: fabric.context_managers.shell_env. A short sketch follows below.
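A minimal sketch of the shell_env approach, reusing the connection details from the question; note that the --password flag in the original command forces a prompt, so it is replaced with --no-password here, and the -c '\dt' command is only illustrative:

from fabric.api import run
from fabric.context_managers import shell_env

def sync():
    # psql on the remote host picks up PGPASSWORD from the environment
    with shell_env(PGPASSWORD='mypassword'):
        run("psql --host=staging.xxx.rds.amazonaws.com --username=x_user "
            "--port=5432 --no-password --dbname=x_database -c '\\dt'")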
Hope it solves your problem.
I'm writing a fab script to do a git pull on a remote server, but I get Permission denied (publickey,keyboard-interactive) when Fabric runs the command.
If I ssh to the server and then do the pull, it works. (I've already setup the keys on the server, so it doesn't ask for passphrases, etc.)
Here's my fabric task:
import fabric.api as fab
from fabric import colors

def update():
    '''
    update workers code
    '''
    with fab.cd('~/myrepo'):
        # pull changes
        print(colors.cyan('Pulling changes...'))
        fab.run('git pull origin master')
How do I get it to work with Fabric?
Edit: My server is a Google Compute instance, and it provides a gcutil tool to ssh to the instance. This is the command it runs to connect to the server:
ssh -o UserKnownHostsFile=/dev/null -o CheckHostIP=no -o StrictHostKeyChecking=no -i /Users/John/.ssh/google_compute_engine -A -p 22 John@123.456.789.101
The script is able to connect to the server AFAICT (it's able to run commands on the server like cd and supervisor and git status), it's just git pull that fails.
You need to edit the fabfile like this in order to enable the SSH agent forwarding option:
from fabric.api import *
from fabric import colors

env.hosts = ['123.456.789.101']
env.user = 'John'
env.key_filename = '/Users/John/.ssh/google_compute_engine'
env.forward_agent = True

def update():
    '''
    update workers code
    '''
    with cd('~/myrepo'):
        # pull changes
        print(colors.cyan('Pulling changes...'))
        run('git pull origin master')
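Agent forwarding only helps if the key is actually loaded into your local agent. A small illustrative check (the check_agent task is not part of the original fabfile):

from fabric.api import run

def check_agent():
    # lists the keys the remote host can see through the forwarded agent;
    # if it prints nothing, run ssh-add /Users/John/.ssh/google_compute_engine locally first
    run('ssh-add -l')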