Cannot run MongoDB code through Python envoy - python

I am trying to import data into MongoDB through Python envoy, but it keeps showing "permission denied". After I changed the permissions of the data file, it now shows "Exec format error" instead.
However, when I run the same command in the MongoDB shell, everything works fine.
Do you know how I can run the commands through Python envoy?
Below is my code:
import os
import sys
import envoy

def load_data():
    data_file = os.path.join(os.getcwd(), 'enron.mbox.json')
    print data_file
    r = envoy.run('mongoimport --db enron --collection mbox --drop --file %s' % data_file)
    print r.std_out
    sys.stderr.write(r.std_err)
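If envoy keeps misbehaving, one fallback is to drop down to the standard library's subprocess and pass the command as an argument list, so no shell is involved at all. This is a sketch, not a drop-in fix: the database and collection names are taken from the question, but the helper function name is my own.

```python
import subprocess

def import_command(data_file):
    # Build the argument list explicitly; no shell quoting is needed.
    return ['mongoimport', '--db', 'enron', '--collection', 'mbox',
            '--drop', '--file', data_file]

def load_data(data_file):
    proc = subprocess.Popen(import_command(data_file),
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()
    return proc.returncode, out, err
```

Because each argument is a separate list element, a path with spaces or odd characters cannot be misinterpreted the way a shell-parsed string can.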

Related

sh: 1: Syntax error: redirection unexpected Python/Bash

I have a problem with my Python command-line script.
I don't know why it does not work; maybe something is wrong with my code.
I'm trying to run the program on the command line through my Python script,
and I'm getting this error from bash: "sh: 1: Syntax error: redirection unexpected"
Please help, I'm just a biologist :)
I'm using Spyder (Anaconda) on Ubuntu.
#!/usr/bin/python
import sys
import os

input_ = sys.argv[1]
output_file = open(sys.argv[2], 'a+')
names = input_.rsplit('.')
for name in names:
    os.system("esearch -db pubmed -query %s | efetch -format xml | xtract -pattern PubmedArticle -element AbstractText >> %s" % (name, output_file))
    print("------------------------------------------")
output_file is a file object. When you do "%s" % output_file, the resulting string is something like "<open file 'filename', mode 'a+' at 0x7f1234567890>". This means that the os.system call is running a command like
command... >> <open file 'filename', mode 'a+' at 0x7f1234567890>
The < after the >> causes the "Syntax error: redirection unexpected" error message.
To fix that, don't open the output file in your Python script, just use the filename:
output_file = sys.argv[2]
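The difference is easy to demonstrate on its own, using a temporary file as a stand-in:

```python
import tempfile

f = tempfile.NamedTemporaryFile(mode='a+')

# Interpolating the file *object* embeds its repr, which contains '<'
# and is parsed by sh as an input redirection.
bad = "command >> %s" % f
# Interpolating the *name* produces a plain path.
good = "command >> %s" % f.name

assert '<' in bad
assert '<' not in good
f.close()
```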
I got a similar error on the following line:
os.system('logger Status changed on %s' % repr(datetime.now()))
Indeed, as nomadictype stated, the problem is in running a plain OS command: the command may include special characters. In my case this was <.
So instead of changing the OS command significantly, I just added quotes, and this works:
os.system('logger "Status changed on %s"' % repr(datetime.now()))
The quotes make the content of the passed parameter invisible to the shell.
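For reference, Python 3's shlex.quote does this quoting for you; a minimal sketch (the message string is made up for illustration):

```python
import shlex

arg = "Status changed on <2024-01-01 00:00:00>"
# shlex.quote wraps the argument in single quotes, so '<' and '>'
# reach the command literally instead of being parsed as redirections.
command = "logger %s" % shlex.quote(arg)

assert command == "logger 'Status changed on <2024-01-01 00:00:00>'"
```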

KeyError in reading CSV file

I am trying to write a script where I pass a file name as an argument from a shell script to a Python script, and the Python script processes that file. It gives me a KeyError, but if I run the same script with the file name hardcoded, it works fine.
#!/bin/sh
LOCKFILE=./test.txt
if [ -e ${LOCKFILE} ] && kill -0 `cat ${LOCKFILE}`; then
    echo "already running"
    exit
fi
trap "rm -f ${LOCKFILE}; exit" INT TERM EXIT
echo $$ > ${LOCKFILE}
# do stuff
FILES=/home/sugoi/script/csv/*
for file in $FILES
do
    python ./csvTest.py $file
    #mv $file ./archive
done
rm -f ${LOCKFILE}
exit
Python:
from pymongo import MongoClient
import csv
import json
import sys

client = MongoClient()
db = client.test
for arg in sys.argv:
    try:
        csvfile = open(arg, 'r')  # if I hardcode the file name here it works fine
    except IOError as e:
        # write to error log
        sys.exit(100)
    reader = csv.DictReader(csvfile)
    header = reader.next()
    for each in reader:
        row = {}
        for field in header:
            row[field] = each[field]
        db.test.update({"_id": row["CustomerId"]},
                       {"$push": {"activities": {"action": row["Action"],
                                                 "date": row["Timestamp"],
                                                 "productId": row["productId"]}}},
                       True)
What am I doing wrong?
Two issues.
Your shell script isn't expanding the file list correctly.
FILES=/home/sugoi/script/csv/* needs to be something like:
FILES=`ls -1 /home/sugoi/script/csv/*;`
Your argument to the python script will only be one file at a time, so why loop through sys.argv?
Just use the argument itself, sys.argv[1]. As @Brian Besmanoff pointed out, it needs index 1 because the script name itself is stored in sys.argv[0].
try:
    csvfile = open(sys.argv[1], 'r')
except IOError as e:
    (...)
Finally: you can just parse directories with Python instead of looping in a shell script. Look at the os module, particularly os.listdir(). A little more work and you can have the whole thing running inside one Python script instead of juggling between shell and calling a script.
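A sketch of that directory walk (the directory path would come from your own setup; filtering on .csv mirrors the shell glob):

```python
import os

def csv_paths(directory):
    # Replace the shell loop: list every .csv file in the directory.
    return sorted(
        os.path.join(directory, name)
        for name in os.listdir(directory)
        if name.endswith('.csv')
    )
```

Each returned path can then be handed straight to the existing CSV-loading code, with no shell script in between.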
The first value in sys.argv is going to be the name of the script itself.

How can I execute "source FileName.sql" in a python script?

I would like to execute the MySQL query source FileName.sql in a Python script on Linux.
I am able to execute other queries like SELECT * FROM table_name, but this one gives an error. I am executing this on a Linux server with a MySQL database, using Python. The frontend I am using is PuTTY.
The Python script I have used is:
import MySQLdb
db = MySQLdb.connect("hostname","username","pswrd","dbname")
cursor = db.cursor()
cursor.execute("source FileName.sql")
db.close()
How can I execute the query source FileName.sql from the location where the file FileName.sql resides?
source is not a SQL command. It's a MySQL CLI command; it only exists in the console application mysql (and wherever else it is implemented). All it does is read the contents of FileName.sql and issue the SQL commands inside.
To do this in python, you can use something like
Edit: This assumes you have 1 query per line! If you have multi-line queries, you'll have to find the means to extract each query from the file.
import MySQLdb

db = MySQLdb.connect("hostname", "user", "pass", "db")
cursor = db.cursor()
for line in open("FileName.sql"):
    cursor.execute(line)
db.close()
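The same pattern can be sanity-checked with the standard library's sqlite3 as a stand-in database (assuming, as the edit above warns, one statement per line):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
# One statement per line, as the loop above assumes.
sql_lines = [
    "CREATE TABLE t (id INTEGER);",
    "INSERT INTO t VALUES (1);",
]
for line in sql_lines:
    cursor.execute(line)
count = cursor.execute("SELECT COUNT(*) FROM t").fetchone()[0]
conn.close()
```

A multi-line CREATE TABLE would break this loop, which is exactly why the caveat above matters.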
You can execute a shell command from Python and import your SQL file.
This example is for MySQL:
import subprocess
# shell=True is needed so the shell handles the "<" redirection
command = "mysql -u username --password=p#55W0rD database_name < file.sql"
p = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
p.communicate()  # you can see if errors are returned
If your SQL file creates a database, remove database_name.
sources:
https://docs.python.org/3/library/subprocess.html#popen-constructor
https://dev.mysql.com/doc/refman/8.0/en/mysql-batch-commands.html
Separate the statements in the SQL file using ";" as the delimiter, then execute each command iteratively.
awesome5team developed a nice solution in https://github.com/awesome5team/General-Resources-Box/issues/7
Code snippet from the same:
import mysql.connector

cnx = mysql.connector.connect(user='root',
                              password='YOUR-PASSWORD-FOR-MYSQL',
                              host='localhost',
                              database='YOUR-DATABASE-NAME')
cursor = cnx.cursor()

def executeScriptsFromFile(filename):
    fd = open(filename, 'r')
    sqlFile = fd.read()
    fd.close()
    # Split the file into individual statements on ";"
    sqlCommands = sqlFile.split(';')
    for command in sqlCommands:
        try:
            if command.strip() != '':
                cursor.execute(command)
        except IOError, msg:
            print "Command skipped: ", msg

executeScriptsFromFile('SQL-FILE-LOCATION')
cnx.commit()
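The split step itself is easy to isolate and check on its own (note that this naive split breaks on ";" inside string literals):

```python
def split_statements(sql_text):
    # Naive split on ';'; does not handle ';' inside quoted strings.
    return [s.strip() for s in sql_text.split(';') if s.strip()]

statements = split_statements(
    "CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);"
)
```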

python copy file in local network (linux -> linux) and output

I'm trying to write a script to copy files to my Raspberry Pi from my desktop PC.
Here is my code (in part):
print "start the copy"
path_pi = '//192.168.2.2:22/home/pi/Stock/'
file_pc = path_file + "/" + file
print "the file to copy is: ", file_pc
shutil.copy2(file_pc, path_pi + file_pi)
Actually I get this error (in French; it means "No such file or directory"):
IOError: [Errno 2] Aucun fichier ou dossier de ce type: '//192.168.2.2:22/home/pi/Stock/exemple.txt'
So, how should I proceed? Must I connect the two machines before trying to copy?
I have tried with:
path_pi = r'//192.168.2.2:22/home/pi/Stock'
But the problem is the same. (And file_pc is a variable.)
Thanks
Edit:
Ok, I found this:
command = 'scp', file_pc, file_pi
p = subprocess.Popen(command, stdout=subprocess.PIPE)
But I have found no way to get the output... (it works with shell=False)
shutil.copy2() works with local files. 192.168.2.2:22 suggests that you want to copy files over ssh. You could mount the remote directory (RaspberryPi) onto a local directory on your desktop machine (sshfs) so that shutil.copy2() would work.
If you want to see the output of a command then don't set stdout=PIPE (note: if you set stdout=PIPE then you should read from p.stdout otherwise the process may block forever):
from subprocess import check_call
check_call(['scp', file_pc, file_pi])
scp will print to whatever places your parent Python script prints.
To get the output as a string:
from subprocess import check_output
output = check_output(['scp', file_pc, file_pi])
Though it looks like scp doesn't print anything by default if its output is redirected.
You could use pexpect to make scp think that it runs in a terminal:
import pipes
import re
import pexpect  # $ pip install pexpect

def progress(locals):
    # extract the percentage from scp's progress output
    print(int(re.search(br'(\d+)%[^%]*$', locals['child'].after).group(1)))

command = "scp %s %s" % tuple(map(pipes.quote, [file_pc, file_pi]))
status = pexpect.run(command, events={r'\d+%': progress}, withexitstatus=1)[1]
print("Exit status %d" % status)
Do you have SSH enabled? Something like this could help you:
import os
os.system("scp FILE USER@SERVER:PATH")

os.system call not working?

I am using the following code to call two different scripts:
soup=BeautifulSoup(htmltext)
title=soup.find('title')
os.system("python scraper/updatedb.py %s" % (title))
os.system("python scraper/insertlinks.py %s" % (tag['href']))
The second one runs, but the first one does not. Please help.
updatedb.py and insertlinks.py work fine when run individually.
updatedb.py is as follows:
import sqlite3 as db
import sys

print "inserted"
key = sys.argv[1]
id = "1"
conn = db.connect('store.db')
cursor = conn.cursor()
with conn:
    cursor.execute('insert into records (id,keyword) values(?,?)', (id, key))
conn.close()
os.system always returns the command's exit status: 0 if the command executed successfully, and a non-zero error code otherwise.
result_code = os.system("your command")
You can then look up what that error code means.
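A minimal sketch of checking that status, using true/false as stand-in commands, plus the list form of subprocess.call so that a title containing spaces or "<" cannot be misparsed by the shell:

```python
import os
import subprocess

# os.system returns the command's exit status; 0 means success.
ok = os.system("true")
bad = os.system("false")

# Safer than string interpolation: pass the argument as a list item,
# so the shell never sees characters like '<' in the title.
rc = subprocess.call(["echo", "<My Title>"], stdout=subprocess.DEVNULL)
```

If the first os.system call here is silently failing, its return value will say so; a title tag's string with spaces or angle brackets is a common reason the interpolated command breaks.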
