I'm learning Python 2.7 on Windows 8.1.
I used echo "XXX" to create a new file, and then I used Python to read it, but the output was unreadable.
But when I used a GUI editor to type something into a new file and then read it in Python, it worked.
Could anyone help me, please?
I wrote this in PowerShell 2.0:
PS C:\Users\Fingal> echo "Stack Overflow is such a good site." > test.txt
Then I wrote this in Notepad++ and saved it as test.py in C:\Users\Fingal:
from sys import argv
script, file_name = argv
txt = open(file_name)
print txt.read()
txt.close()
Then I went back to PowerShell and typed:
PS C:\Users\Fingal\> python test.py test.txt
a ataaacaka a0avaearafalaoawa aiasa asauacaha aaa agaoaoada asaiattaea.aa
The output was unreadable.
Fingal
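The interleaved characters in that output are typical of a UTF-16 encoded file: PowerShell's > redirection writes Unicode (UTF-16LE) by default, whereas a GUI editor such as Notepad++ saves plain ASCII/UTF-8, which is why the GUI-created file read fine. A minimal sketch of reading the PowerShell-created file in Python 2.7 with an explicit encoding, assuming UTF-16 really is the cause:
import codecs
from sys import argv
script, file_name = argv
# decode the UTF-16 text that the PowerShell redirect wrote
txt = codecs.open(file_name, encoding='utf-16')
print txt.read()
txt.close()
Alternatively, writing the file from PowerShell with echo "..." | Out-File -Encoding ascii test.txt should produce a plain ASCII file that the original script can read unchanged.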
Related
I would like to save both the input code and the output into a file. For example, take the following Python code, code.py:
print(2+2)
print(3+2)
to create a code-and-output.txt:
>>> print(2+2)
4
>>> print(3+2)
5
But I cannot get it working. Basically, I want code-and-output.txt to capture what would happen if I ran these statements in the interactive Python interpreter (code + output).
Ways that I have tried so far:
Redirect stdout:
python code.py > code-and-output.txt
It only saves the output.
Redirect stdout and stdin:
python < code.py > code-and-output.txt
It does the same (only output).
nohup
nohup python code.py
The same problem: only output.
Script
script -q code-and-output.txt
python
print(2+2)
print(2+3)
Ctrl+D
Ctrl+D
It works, but I have to do it manually. Moreover, it saves some garbage that I cannot silence with -q.
Bash Script
# bash-file.sh
python &
print(2+2)
print(2+3)
Does not work: the commands run in the bash console, not in Python. It does not work with & either: the Python REPL never ends.
Using tty
Open another terminal, e.g. /dev/pts/2, and send the above bash-file.sh to it:
cat bash-file.sh > /dev/pts/2
It just copies but does not run.
I am not interested in solutions like Jupyter and IPython. They have their own problems and do not address my requirement.
Any solution through Linux commands (preferably) or Python? Thank you.
Save this as repl_sim.py in the same directory as your code.py:
with open("code.py", 'r') as input_file:
for line in input_file:
print(f">>> {line.strip()}")
eval(line)
Then run in your terminal with the following if you want to redirect the output to a text file named code-and-output.txt:
python repl_sim.py > code-and-output.txt
-OR-
Then run in your terminal with the following if you want to see the output and also produce the matching text file:
python repl_sim.py | tee code-and-output.txt
It at least works for the example you provided as code.py.
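Note that eval() only handles expressions. If code.py also contains single-line statements (assignments, imports), a variant of the same loop using exec() instead of eval() (not part of the original answer) should cover those as well:
with open("code.py", 'r') as input_file:
    for line in input_file:
        print(f">>> {line.strip()}")
        exec(line)  # exec() accepts statements as well as expressions
Multi-line constructs such as loops or function definitions would still break, since the file is processed one line at a time.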
Here is a pure-Python version of the first option above, so that you don't need the shell redirect.
Save this code as repl_sim.py:
import contextlib

with open('code-and-output.txt', 'w') as f:
    with contextlib.redirect_stdout(f):
        with open("code.py", 'r') as input_file:
            for line in input_file:
                print(f">>> {line.strip()}")
                eval(line)
Then run in your terminal with:
python repl_sim.py
That will result in code-and-output.txt with your desired content.
The contextlib usage is based on Raymond Hettinger's August 17th, 2018 tweet and the contextlib.redirect_stdout() documentation.
I have a large Python script that prints a bunch of output to my terminal console. The printing does not happen all at once: some print statements emit one blob of output, then another part of the code prints some more below it, and this goes on for as long as the main loop runs.
I get the output I want, but it all goes to the console, since that is where the main script is running.
It would be very helpful if, in addition to printing to the console, I could also capture all of the console output, in the same format, in a text file for retention.
Again, there are print statements scattered across different parts of the script, so I am not sure how to retain the whole console output in the same format in a final text file.
If you want to do the redirection within the Python script, setting sys.stdout to a file object does the trick:
import sys
sys.stdout = open('file', 'w')
print('test')
A far more common method is to use shell redirection when executing (same on Windows and Linux):
$ python foo.py > file
Check this thread Redirect stdout to a file in Python?
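If you want the output to keep appearing on the console while also being written to a file (which is what the question asks for), a minimal sketch of a tee-like stdout wrapper (not from the linked thread; the class and file names are illustrative, and it assumes Python 3):
import sys

class Tee:
    def __init__(self, *streams):
        self.streams = streams

    def write(self, data):
        # send the same data to every underlying stream
        for s in self.streams:
            s.write(data)

    def flush(self):
        for s in self.streams:
            s.flush()

log_file = open('console.log', 'w')
sys.stdout = Tee(sys.__stdout__, log_file)

print('this line goes to both the console and console.log')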
A custom print function that writes to both console and file; replace every print call in the code with printing:
outputFile = open('outputfile.log', 'w')

def printing(text):
    print(text)
    if outputFile:
        outputFile.write(str(text) + '\n')  # add a newline so logged lines don't run together
You have to add a file argument to the print() function; note that it must be an open file object, not a filename:
print('whatever', file=output_file)
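For example (output.txt and output_file are just illustrative names):
with open('output.txt', 'w') as output_file:
    print('whatever', file=output_file)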
I would rather go ahead with bash and use the tee command. It redirects the output to a file while still printing it to the console:
python -u my.py | tee my_file.txt
If your Python script is file.py, then use:
python3 file.py > output.txt
Or
python file.py > output.txt
Use whichever matches your Python version. The redirection operator (>) sends everything the program writes to stdout into the file output.txt.
EDIT:
python3 file.py > output.txt;cat output.txt
The above line can be used to print output.txt after the program has finished.
EDIT 2:
Another possible option is to use a custom print function:
f = open('output.txt', 'w')

def custom_print(*s, e='\n'):
    # print the values to the console, separated by spaces
    for i in s[:-1]:
        print(i, end=' ')
    print(s[-1], end=e)
    # write the same line to the file
    f.write(' '.join(str(i) for i in s) + e)

# Your code
# ...
f.close()
So I've written a pretty basic script as part of a larger test of my system. Essentially, it just creates a new file text.txt and writes "abject failure" to the file.
#!/usr/bin/env python
print "running"
text_file = open("text.txt", "a")
text_file.write("abject failure")
text_file.close()
print "success"
The script works wonderfully when I run it from IDLE, but when I run it from the Mac OS X terminal with
python /Users/.../serverside.py
I only get the "print" values but nothing is written to the text.txt file. Why is this and how can I fix it?
To make sure that your script always works, no matter where you call it from, set an absolute path to your file:
text_file = open("/Users/path/to/file/text.txt", "a")
The other solution, as stated in the comment, is to change your working directory to the path where you want the file to be
cd /Users/path/to/file
python /Users/.../serverside.py
I checked out Reading stdout from one program in another program but did not find the answer I was looking for.
I'm new to Linux, and I'm using the argparse module in Python to pass arguments to a program through the terminal on my Mac.
I have program_1.py, which reads a file via sys.stdin and writes data to sys.stdout.
I'm trying to get program_2.py to take the data that program_1.py wrote to sys.stdout and read it as its sys.stdin.
I tried something along the lines of:
Mu$ python program-1.py <sample.txt> program-2.py
For simplicity, let's say that 'sample.txt' just had the string '1.6180339887'
How can program_2.py read the sys.stdout of the previous program as its sys.stdin?
In this simple example, I am just trying to get program_2.py to output '1.6180339887' to sys.stdout so I can see how this works.
Somebody told me to use the | character for pipelines but I couldn't make it work correctly
Using a pipe is correct:
python program-1.py sample.txt | python program-2.py
Here's a complete example:
$ cat sample.txt
hello
$ cat program-1.py
import sys
print open(sys.argv[1]).read()
$ cat program-2.py
import sys
print("program-2.py on stdin got: " + sys.stdin.read())
$ python program-1.py sample.txt
hello
$ python program-1.py sample.txt | python program-2.py
program-2.py on stdin got: hello
(PS: you could have included a complete test case in your question. That way, people could say what you did wrong instead of writing their own)
program-1.py:
import sys
if len(sys.argv) == 2:
    with open(sys.argv[1], 'r') as f:
        sys.stdout.write(f.read())
program-2.py:
import sys
ret = sys.stdin.readline()
print ret
sample.txt
1.6180339887
in your shell:
Mu$ python program-1.py sample.txt | python program-2.py
output:
1.6180339887
more info here:
http://en.wikipedia.org/wiki/Pipeline_(Unix) and here http://en.wikibooks.org/wiki/Python_Programming/Input_and_Output
I have a simple python script that just takes in a filename, and spits out a modified version of that file. I would like to redirect stdout (using '>' from the command line) so that I can use my script to overwrite a file with my modifications, e.g. python myScript.py test.txt > test.txt
When I do this, the resulting test.txt does not contain any text from the original test.txt - just the additions made by myScript.py. However, if I don't redirect stdout, then the modifications come out correctly.
To be more specific, here's an example:
myScript.py:
#!/usr/bin/python
import sys
fileName = sys.argv[1]
sys.stderr.write('opening ' + fileName + '\n')
fileHandle = file(fileName)
currFile = fileHandle.read()
fileHandle.close()
sys.stdout.write('MODIFYING\n\n' + currFile + '\n\nMODIFIED!\n')
test.txt
Hello World
Result of python myScript.py test.txt > test.txt:
MODIFYING
MODIFIED!
The reason it works that way is that, before Python even starts, Bash interprets the redirection operator and opens an output stream to write stdout to the file. That operation truncates the file to size 0 - in other words, it clears the contents of the file. So by the time your Python script starts, it sees an empty input file.
The simplest solution is to redirect stdout to a different file, and then afterwards, rename it to the original filename.
python myScript.py test.txt > test.out && mv test.out test.txt
Alternatively, you could alter your Python script to write the modified data back to the file itself, so you wouldn't have to redirect standard output at all.
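A minimal sketch of that in-place approach (the file handling mirrors the script in the question; reading everything first makes it safe to reopen the same file for writing):
import sys

file_name = sys.argv[1]

# read the entire file before writing anything
with open(file_name) as fh:
    contents = fh.read()

# only now reopen the same file for writing, which truncates it
with open(file_name, 'w') as fh:
    fh.write('MODIFYING\n\n' + contents + '\n\nMODIFIED!\n')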
Try redirecting to a new file; the redirect operator truncates the target file before your script gets a chance to read it.
The utility sponge present in the moreutils package in Debian can handle this gracefully.
python myScript.py test.txt | sponge test.txt
As indicated by its name, sponge will drain its standard input completely before opening test.txt and writing out the full contents of stdin.