subprocess, cannot concatenate 'str' and ... objects - python

I have code like:
if len(sys.argv) > 1:
    host = sys.argv[1]
    number = sys.argv[2] if len(sys.argv) > 2 else "1"
    size = sys.argv[3] if len(sys.argv) > 3 else "56"
    timeout = sys.argv[4] if len(sys.argv) > 4 else "1"
    proc = subprocess.Popen(["ping \"" + host + "\" -c \"" + number + "\" -s \"" + size + "\" -W \"" + timeout + "\" 2>/dev/null | grep packets | sed 's/[^0-9,%]*//g'"], stdout=subprocess.PIPE, shell=True)
    (out, err) = proc.communicate()
    print "address=\"" + host + "\" data=" + out
I need to split out into a list. How can I achieve this?
Everything I try causes the error: cannot concatenate 'str' and ... objects
Like when I tried:
...
res=list(out)
print "address=\"" + host + "\" data=" + res
I got error:
TypeError: cannot concatenate 'str' and 'list' objects

To concatenate a string and a list, you first have to turn the list into a string:
res=list(out)
print "address=\"" + str(host) + "\" data=" + str(res)


Python Postgres query with transaction block over single connection

Currently I have two separate statements being passed to Postgres (Greenplum):
1. Truncate a table
2. Load data using \copy
myStr="export PGPASSWORD=" + dbPass + "; psql -h " + dbHost + " -p " + dbPort + " -d " + dbName + " -U " + dbUser + " -c " + "\"" + "truncate table " + dbTable + ";\""
print(myStr)
subprocess.call(myStr,shell=True)
myStr="export PGPASSWORD=" + dbPass + "; psql -h " + dbHost + " -p " + dbPort + " -d " + dbName + " -U " + dbUser + " -c " + "\"" + "\\" + "copy " + dbTable + " from " + "'" + csvfile + "' with " + copyOpts + ";" + "select count(*) from " + dbTable + ";\""
print(myStr)
subprocess.call(myStr,shell=True)
Sometimes the load has errors but the truncate has already happened, so I'm trying to run the two statements over one connection and wrap them in a transaction block (BEGIN ... COMMIT;); that way, if the data load fails, it rolls back to before the truncate.
I tried the below method:
myStr="export PGPASSWORD=" + dbPass + "; psql -h " + dbHost + " -p " + dbPort + " -d " + dbName + " -U " + dbUser + " -c " + "\"" + "truncate table " + dbTable + ";" + " \\" + "copy " + dbTable + " from " + "'" + csvfile + "' with " + copyOpts + ";" + "select count(*) from " + dbTable + ";\""
print(myStr)
Which resolves to the command:
export PGPASSWORD=abcde;
psql -h abcde.testserver.corp
-p 5432 -d namem -U username -c
"truncate table schema.example;
\copy schema.example from
'/home/testing/schema/schema.example_export.csv'
with header null as '' escape 'off' delimiter E',' ;
select count(*) from schema.example;"
However I am getting the error:
ERROR: syntax error at or near "\"
I believe this is because the \ commands have to be on a separate line.
Is there a way to split the command into separate lines so I can execute all the commands in a single connection?
The problem is that you can't separate backslash commands from other commands if you are using the -c option. You can send your commands via STDIN to psql using echo:
export PGPASSWORD=abcde;
echo "truncate table schema.example;
\copy schema.example from '/home/testing/schema/schema.example_export.csv' with header null as '' escape 'off' delimiter E',' ;
select count(*) from schema.example;" | psql -h abcde.testserver.corp -p 5432 -d namem -U username
That's a little bit clumsy. It's better to use subprocess.Popen:
# note: the \copy meta-command must stay on a single line, because psql
# treats the end of the line as the end of the meta-command
theCommand = """truncate table schema.example;
\copy schema.example from '/home/testing/schema/schema.example_export.csv' with header null as '' escape 'off' delimiter E',' ;
select count(*) from schema.example;"""
theProcess = subprocess.Popen(["psql", "-h", "abcde.testserver.corp", "-p", "5432",
                               "-d", "namem", "-U", "username"],
                              stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
theOutput, theErrors = theProcess.communicate(input=theCommand)
But the best approach is to avoid shell commands altogether and use a database adapter like PyGreSQL.
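If you go that route, here is a minimal sketch using psycopg2 (a different adapter than PyGreSQL, but the same idea): both statements run inside one transaction, so a failed load rolls back the truncate. The connection details are the ones from the question, and the COPY options are simplified to plain CSV with a header.
import psycopg2

conn = psycopg2.connect(host="abcde.testserver.corp", port=5432,
                        dbname="namem", user="username", password="abcde")
try:
    with conn:  # commits on success, rolls back on any exception
        with conn.cursor() as cur:
            cur.execute("TRUNCATE TABLE schema.example;")
            with open("/home/testing/schema/schema.example_export.csv") as f:
                cur.copy_expert("COPY schema.example FROM STDIN WITH CSV HEADER", f)
            cur.execute("SELECT count(*) FROM schema.example;")
            print(cur.fetchone()[0])
finally:
    conn.close()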

TypeError: isfile() missing 1 required positional argument : 'remotepath'

with pysftp.Connection(ipaddr, username="uname", password="pass", cnopts=cnopts) as sftp:
    sftp.put(uploc + ufile, "/home/pi/PIFTP/dloads/" + ufile)
    checkfile = ("/home/pi/PIFTP/dloads/" + ufile)
    chfile = pysftp.Connection.isfile(checkfile)
    if chfile == True:
        print (Style.BRIGHT + "[" + Fore.GREEN + "OK" + Fore.WHITE + "] ")
    else:
        print (Style.BRIGHT + Fore.RED + ipaddr + " is unacsessible")
As you can see, I'm trying to check a file that has just been uploaded. In this case "/home/pi/PIFTP/dloads/" + ufile is the file's remote download path. What am I missing? Thanks.
Also, the file does arrive before the error occurs.
You'll need to use the Connection instance sftp you have, not the class.
chfile = sftp.isfile(checkfile)
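In the context of the question's snippet (same names and paths as above), that looks like:
with pysftp.Connection(ipaddr, username="uname", password="pass", cnopts=cnopts) as sftp:
    sftp.put(uploc + ufile, "/home/pi/PIFTP/dloads/" + ufile)
    checkfile = "/home/pi/PIFTP/dloads/" + ufile
    if sftp.isfile(checkfile):  # call isfile() on the open connection, not on the class
        print (Style.BRIGHT + "[" + Fore.GREEN + "OK" + Fore.WHITE + "] ")
    else:
        print (Style.BRIGHT + Fore.RED + ipaddr + " is unacsessible")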

python IndexError: list index out of range on arp_scan script

I'm getting this error when running my code:
prefix = ip.split('.')[0] + '.' + ip.split('.')[1] + '.' + ip.split('.')[2] + '.'
IndexError: list index out of range
I didn't write the script myself. I need to perform an ARP scan on the intranet and output every IP address that responds. My code is this:
#!/usr/bin/python
import sys
import logging
import subprocess
logging.getLogger("scapy.runtime").setLevel(logging.ERROR)
from scapy.all import *
if len(sys.argv) != 2:
    print("Usage - ./arp_disc.py [interface]")
    print("Example - ./arp_disc.py eth0")
    print("Example will perform an ARP scan of the local subnet to which eth0 is assigned")
    sys.exit()
interface = str(sys.argv[1])
ip = subprocess.check_output("ifconfig " + interface + " | grep 'inet addr' | cut -d ':' -f 2 | cut -d ' ' -f 1", shell=True).strip()
prefix = ip.split('.')[0] + '.' + ip.split('.')[1] + '.' + ip.split('.')[2] + '.'
for addr in range(0,254):
    answer = sr1(ARP(pdst=prefix+str(addr)), timeout=1, verbose=0)
    if answer == None:
        pass
    else:
        print prefix+str(addr)
Use the following instead:
ip = subprocess.check_output("ifconfig " + interface + " | grep 'inet' | cut -d ':' -f 2 | cut -d ' ' -f 1", shell=True).strip()
ip = ip[5:]
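If parsing ifconfig output keeps breaking (its format differs between distributions, which is what makes grep match nothing and triggers the IndexError), an alternative sketch asks the kernel for the interface address directly instead of scraping text (Linux-only; get_interface_ip is a helper name I made up):
import fcntl
import socket
import struct

def get_interface_ip(interface):
    # SIOCGIFADDR ioctl: returns the IPv4 address bound to the interface
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    packed = fcntl.ioctl(s.fileno(), 0x8915, struct.pack('256s', interface[:15]))
    return socket.inet_ntoa(packed[20:24])

ip = get_interface_ip(interface)            # e.g. '192.168.1.23'
prefix = '.'.join(ip.split('.')[:3]) + '.'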

python string replacement str type error dynamically building aws user-data script

The problem: I'm trying to dynamically build a Python user-data script for Amazon in a Jenkins deploy script and pass it to an ASG to be executed at runtime. I pass my vars to the deploy script and then dynamically create the Python script based on the arguments.
I'm getting an unexpected string-formatting error and I'm not entirely sure why. handoff.sh is what passes the arguments from Jenkins to the deploy script.
The error:
[deploy-and-configure-test] $ /bin/sh -xe /tmp/hudson8978997207867591628.sh
+ sh /var/lib/jenkins/workspace/deploy-and-configure-test/handoff.sh
Traceback (most recent call last):
File "/var/lib/jenkins/workspace/deploy-and-configure-test/asgBuilder.py", line 393, in <module>
''' % (str(repo), str(playbook),str(user_data_ins), str(in_user_data)))
TypeError: %u format: a number is required, not str
the dynamic portion of my deploy script:
in_user_data = args.in_user_data
playbook = args.playbook
repo = args.repo
user_data_ins = ('''export CLOUD_ENVIRONMENT=%s\n
export CLOUD_MONITOR_BUCKET=%s\n
export CLOUD_APP=%s\n
export CLOUD_STACK=%s\n
export CLOUD_CLUSTER=%s\n
export CLOUD_AUTO_SCALE_GROUP=%s\n
export CLOUD_LAUNCH_CONFIG=%s\n
export EC2_REGION=%s\n
export CLOUD_DEV_PHASE=%s\n
export CLOUD_REVISION=%s\n
export CLOUD_DOMAIN=%s\n
export SG_GROUP=%s\n''' % (cloud_environment,
cluster_monitor_bucket,
cluster_name,
cloud_stack,
cloud_cluster,
cloud_auto_scale_group,
cloud_launch_config,
provider_region,
cloud_dev_phase,
cloud_revision,
cloud_domain,
export_env_sg_name))
user_data_ins = ('''
#!/usr/bin/python
import os
import subprocess
import time
import uuid
def shell_command_execute(command):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    (output, err) = p.communicate()
    print output
    return output
repo = "%s"
playbook = "%s"
echo_bash_profile = "echo %s >> ~/.bash_profile" % user_echo
shell_command_execute(echo_bash_profile)
var_user_data = "%s"
for varb in var_user_data.split('|'):
    echo_bash_profile_passed = "echo " + varb + " >> ~/.bash_profile"
    shell_command_execute(echo_bash_profile_passed)
command = 'git clone ' + repo
shell_command_execute(command)
folder = repo.split('/')[4].replace('.git','')
#https://github.com/test/test.git # replaced for security.
execute_playbook = ('ansible-playbook -i "localhost," -c local' + '/' + os.path.dirname(os.path.realpath(__file__)) + '/' + folder + '/' + playbook >> ansible.log')
print execute_playbook
shell_command_execute(execute_playbook)
''' % (str(repo), str(playbook),str(user_data_ins), str(in_user_data)))
text_file = open("user-data.py", "wa")
text_file.write(user_data_ins)
text_file.close()
lc_user_data = '${file("%s/user-data.py")}' %wd
Updated, still not working:
user_data_ins = ('''export CLOUD_ENVIRONMENT=%s\n
export CLOUD_MONITOR_BUCKET=%s\n
export CLOUD_APP=%s\n
export CLOUD_STACK=%s\n
export CLOUD_CLUSTER=%s\n
export CLOUD_AUTO_SCALE_GROUP=%s\n
export CLOUD_LAUNCH_CONFIG=%s\n
export EC2_REGION=%s\n
export CLOUD_DEV_PHASE=%s\n
export CLOUD_REVISION=%s\n
export CLOUD_DOMAIN=%s\n
export SG_GROUP=%s\n''' % (cloud_environment,
cluster_monitor_bucket,
cluster_name,
cloud_stack,
cloud_cluster,
cloud_auto_scale_group,
cloud_launch_config,
provider_region,
cloud_dev_phase,
cloud_revision,
cloud_domain,
export_env_sg_name))
user_data_ins = ('''
#!/usr/bin/python
import os
import subprocess
import time
import uuid
def shell_command_execute(command):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    (output, err) = p.communicate()
    print output
    return output
repo = "%s"
playbook = "%s"
echo_bash_profile = "echo %s >> ~/.bash_profile" % user_echo
shell_command_execute(echo_bash_profile)
var_user_data = "%s"
for varb in var_user_data.split('|'):
    echo_bash_profile_passed = "echo " + varb + " >> ~/.bash_profile"
    shell_command_execute(echo_bash_profile_passed)
command = 'git clone ' + repo
shell_command_execute(command)
folder = repo.split('/')[4].replace('.git','')
#https://github.com/zukeru/vision_provis.git
execute_playbook = ('ansible-playbook -i "localhost," -c local' + '/' + os.path.dirname(os.path.realpath(__file__)) + '/' + folder + '/' + playbook >> ansible.log')
print execute_playbook
shell_command_execute(execute_playbook)
''' % (str(repo), str(playbook),str(user_data_ins), str(in_user_data)))
text_file = open("user-data.py", "wa")
text_file.write(user_data_ins)
text_file.close()
lc_user_data = '${file("%s/user-data.py")}' %wd
@Grant Zukel, I would recommend doing the following.
Change the last line to:
'''.format(str(repo), str(playbook), str(user_data_ins), str(in_user_data)))
And in your code change the first %s to {0} (which would be str(repo)), and every subsequent one to {1}, {2}, etc.
The problem is that you have string formatting inside the string.
Whenever you have this, you need to double the percent sign:
echo_bash_profile = "echo %s >> ~/.bash_profile" %% user_echo
It is this line that is causing the error:
bash_profile % user_echo
I would recommend using the str.format method if you are using Python 2.6 or higher.
Try this:
user_data_ins = ('''
#!/usr/bin/python
import os
import subprocess
import time
import uuid
def shell_command_execute(command):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    (output, err) = p.communicate()
    print output
    return output
repo = "{0}"
playbook = "{1}"
echo_bash_profile = "echo {2} >> ~/.bash_profile" % user_echo
shell_command_execute(echo_bash_profile)
var_user_data = "{3}"
for varb in var_user_data.split('|'):
    echo_bash_profile_passed = "echo " + varb + " >> ~/.bash_profile"
    shell_command_execute(echo_bash_profile_passed)
command = 'git clone ' + repo
shell_command_execute(command)
folder = repo.split('/')[4].replace('.git','')
#https://github.com/zukeru/vision_provis.git
execute_playbook = ('ansible-playbook -i "localhost," -c local' + '/' + os.path.dirname(os.path.realpath(__file__)) + '/' + folder + '/' + playbook >> ansible.log')
print execute_playbook
shell_command_execute(execute_playbook)
'''.format(str(repo), str(playbook),str(user_data_ins), str(in_user_data)))
This line seems to be causing the issue:
echo_bash_profile = "echo %s >> ~/.bash_profile" % user_echo
Most likely the formatter sees the "% u" in "% user_echo" as a %u conversion.
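A tiny standalone example of the escaping rule (made-up values, Python 2): a literal percent sign inside a %-formatted template has to be written as %%, otherwise the formatter tries to read what follows it (here the " u" of " user_echo") as a conversion such as %u.
user_data = "FOO=bar"
template = '''repo = "%s"
echo_bash_profile = "echo %%s >> ~/.bash_profile" %% user_echo
'''
print template % user_data
# prints:
# repo = "FOO=bar"
# echo_bash_profile = "echo %s >> ~/.bash_profile" % user_echo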
OK, so @FirebladDan, you were right, I missed one. Here is the working code:
user_data_ins = ('''
#!/usr/bin/python
import os
import subprocess
import time
import uuid
def shell_command_execute(command):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
    (output, err) = p.communicate()
    print output
    return output
repo = "%s"
playbook = "%s"
echo_bash_profile = "echo " + %s + " >> ~/.bash_profile"
shell_command_execute(echo_bash_profile)
var_user_data = "%s"
for varb in var_user_data.split('|'):
    echo_bash_profile_passed = "echo " + varb + " >> ~/.bash_profile"
    shell_command_execute(echo_bash_profile_passed)
command = 'git clone ' + repo
shell_command_execute(command)
folder = repo.split('/')[4].replace('.git','')
#https://github.com/zukeru/vision_provis.git
execute_playbook = ('ansible-playbook -i "localhost," -c local' + '/' + os.path.dirname(os.path.realpath(__file__)) + '/' + folder + '/' + playbook >> ansible.log')
print execute_playbook
shell_command_execute(execute_playbook)
''' % (str(repo), str(playbook),str(user_data_ins), str(in_user_data)))
text_file = open("user-data.py", "wa")
text_file.write(user_data_ins)
text_file.close()
lc_user_data = '${file("%s/user-data.py")}' %wd

python multithreaded ssh app

I have an app I've scraped together to try and spawn 3 threads and ssh into a server simultaneously.
I know the application below is crudely written and wrong; I'm looking for some guidance to accomplish the end goal mentioned above.
For the argument passing, I know I need to finesse it with something like cmd or cmd2 later on, but for now that's not my primary concern.
I know that right now I'm spawning a subprocess and doing things serially. I look forward to your replies.
#!/usr/bin/python
import sys, os
import subprocess
if (len(sys.argv) > 1):
    if( sys.argv[1] == 'start' ):
        print "starting jboss"
        arg = str(sys.argv[1])
    elif( sys.argv[1] == 'stop' ):
        print "stopping"
    elif( sys.argv[1] == 'status' ):
        print "getting status"
        arg = str(sys.argv[1])
        print arg
    else:
        print "start or stop?"
        exit(1)
else:
    print "unexpected error"

host = "10.24.14.10 "
command = "sudo /etc/init.d/jbossas " + arg
fullcommand = "ssh " + " " + host + " " + " " + command + " " + arg
print "full command: ", fullcommand
process = subprocess.Popen(fullcommand, shell=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, stderr = process.communicate()
status = process.poll()
print output

host = "10.24.14.20 "
fullcommand = "ssh " + " " + host + " " + " " + command + " " + arg
process = subprocess.Popen(fullcommand, shell=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, stderr = process.communicate()
status = process.poll()
print output

host = "10.30.1.1 "
fullcommand = "ssh " + " " + host + " " + " " + command + " " + arg
process = subprocess.Popen(fullcommand, shell=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, stderr = process.communicate()
status = process.poll()
print output
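For reference, a minimal sketch (not from the original post) of one way to run the same ssh command against the three hosts concurrently using the standard threading module; it reuses the arg value built above, and the host list is the one from the question:
import subprocess
import threading

hosts = ["10.24.14.10", "10.24.14.20", "10.30.1.1"]
command = "sudo /etc/init.d/jbossas " + arg

def run_remote(host, remote_command):
    # passing the argv as a list avoids shell quoting problems
    process = subprocess.Popen(["ssh", host, remote_command],
                               stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    output, _ = process.communicate()
    print host + ":\n" + output

threads = [threading.Thread(target=run_remote, args=(h, command)) for h in hosts]
for t in threads:
    t.start()
for t in threads:
    t.join()
A library like paramiko (or fabric) would let you skip shelling out to ssh entirely, but the threading approach keeps the existing command intact.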
