My python function won't write to a file - python

I've made a function for a Flask application that should create a decorator and a function and then write them to a file, but when I run it, it neither creates nor writes to the file, and it doesn't raise any errors.
def make_route(title):
    route = "#app.route(/%s)" %(title)
    def welcome():
        return render_template("%s.html" %(title))
    return welcome
    f = open('test1.txt', 'w')
    f.write(route, '/n', welcome, '/n')
    f.close()

make_route('Hi')

A return statement terminates execution of the function, so any code after it is never reached. Also, write takes a single string argument, not arbitrary objects. You want:
def make_route(title):
    route = "#app.route(/%s)" %(title)
    def welcome():
        return render_template("%s.html" %(title))
    with open('test1.txt', 'w') as f:
        f.write('%r\n%r\n' % (route, welcome))
    return welcome

make_route('Hi')

I would use philhag's answer, but with %s instead of %r, otherwise you'll write the quoted repr of the string; and you can set the function's __name__ if you want to use make_route more than once (which you probably do):
def make_route(title):
    route = "#app.route('/%s')" %(title)
    def welcome():
        return render_template("%s.html" %(title))
    with open('test2.py', 'w') as f:
        f.write('%s\n%s\n' % (route, welcome))
    welcome.__name__ = title
    return welcome

make_route('Hi')
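For what it's worth, a minimal sketch of why setting __name__ helps, assuming the returned functions are eventually registered as Flask views (the question only hints at that): Flask names endpoints after the view function, so two functions both called welcome would collide.

# Hypothetical usage: both inner functions start out named "welcome";
# renaming them lets Flask register them as distinct endpoints.
hi = make_route('Hi')
bye = make_route('Bye')
print(hi.__name__)   # Hi
print(bye.__name__)  # Bye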

Related

print key into txt file

So I am working on a script to generate serial numbers for a product. I want to make a txt file where the script prints the generated key. Somehow it can't print in there, but I don't know what I need to change about it.
key = Key('aaaa-bbbb-cccc-dddd-1111')
fh = open('key.txt')
fh.write(Key)
Ok, based on your response, I've mocked up the Key class as follows. Without more information, it's not possible to give you a definitive answer, but hopefully this helps!
class Key:
    def __init__(self, serial):
        self.serial = serial

    def process_serial(self):
        # Your processing here
        ...
        return processed_serial  # This should be a string
Then to write to file, you can do:
key = Key('aaaa-bbbb-cccc-dddd-1111')
with open('key.txt', 'w') as f:
    f.write(key.process_serial())
Alternatively, you can add a __str__ method to your class, which will specify what happens when you call the Python builtin str on your object.
class Key:
    def __init__(self, serial):
        self.serial = serial

    def __str__(self):
        out = ...  # construct what you want to write to file
        return out
Giving:
key = Key('aaaa-bbbb-cccc-dddd-1111')
with open('key.txt', 'w') as f:
    f.write(str(key))
You might also consider adding this as a method to your Key class:
class Key:
    def __init__(self, serial):
        self.serial = serial

    def process_serial(self):
        # Your processing here
        ...
        return processed_serial  # This should be a string

    def write(self, file_name):
        with open(file_name, 'w') as f:
            f.write(self.process_serial())
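Usage would then be a one-liner (a sketch, assuming process_serial is filled in to return the string you want on disk):

key = Key('aaaa-bbbb-cccc-dddd-1111')
key.write('key.txt')  # opens key.txt, writes process_serial()'s result, closes the file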
Try:
key = "Key('aaaa-bbbb-cccc-dddd-1111')"
fh = open('key.txt', "w")
fh.write(key)
To create a text file that doesn't already exist, you need to open it with "w".
Try doing:
key = Key('aaaa-bbbb-cccc-dddd-1111')
with open('key.txt', 'w') as fh:
    fh.write(key)
Hope that helps!
Note: the write must happen inside the with block; once you leave the block, the file is considered closed.

Python: How to use write/export to file in a different function?

I'm trying to write an export-to-file function in Python. Here's a snippet:
#!/usr/bin/env python

def function_1(x):
    for i in range(len(x)):
        ip = packet[i]
        mac = mac[i]
        print '{}, {}'.format(ip, mac)

def export():
    f = open("data.txt", "w")
    f.write()  # f.write(function_1(x))??
    f.close()

if __name__ == "__main__":
    function_1(10)
    export()
I can write to a file by putting the write in the same function as function_1, but how would I do it from another function? Do I need to return something?
Yes, f.write needs a string for input. The easiest way is to create the string directly in your function and return it:
def function_1(x):
    ret_val = ""
    for i in range(len(x)):
        ip = packet[i]
        mac = mac[i]
        ret_val += '{}, {}\n'.format(ip, mac)
    return ret_val
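The export function can then take that string and write it out, for example (a sketch; x stands for whatever sequence your real function_1 iterates over):

def export(data, path="data.txt"):
    # data is the string built and returned by function_1
    with open(path, "w") as f:
        f.write(data)

if __name__ == "__main__":
    export(function_1(x))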

exposing the inner methods of a class, and using them

Let's say I have a class like so:
class Shell:
    def cat(self, file):
        try:
            with open(file, 'r') as f:
                print f.read()
        except IOError:
            raise IOError('invalid file location: {}'.format(f))

    def echo(self, message):
        print message

    def ls(self, path):
        print os.listdir(path)
In a JavaScript context, you might be able to do something like "Class"[method_name](), depending on how things were structured. I am looking for something similar in Python to make this a "simulated operating system". E.g.:
import os

def runShell(user_name):
    user_input = None
    shell = Shell()
    while(user_input != 'exit' or user_input != 'quit'):
        user_input = raw_input('$' + user_name + ': ')
        ...
Now, the idea is that they can type in something like this...
$crow: cat ../my_text
... and behind the scenes, we get this:
shell.cat('../my_text')
Similarly, I would like to be able to print all method definitions that exist within that class when they type help. E.g.:
$crow: help\n
> cat (file)
> echo (message)
> ls (path)
Is such a thing achievable in Python?
You can use the built-in function vars to expose all the members of an object. That's maybe the simplest way to list those for your users. If you're only planning to print to stdout, you could also just call help(shell), which will print your class members along with docstrings and so on. help is really only intended for the interactive interpreter, though, so you'd likely be better off writing your own help-outputter using vars and the __doc__ attribute that's magically added to objects with docstrings. For example:
class Shell(object):
    def m(self):
        '''Docstring of C#m.'''
        return 1

    def t(self, a):
        '''Docstring of C#t'''
        return 2

for name, obj in dict(vars(Shell)).items():
    if not name.startswith('__'):  # filter builtins
        print(name, '::', obj.__doc__)
To pick out and execute a particular method of your object, you can use getattr, which grabs an attribute (if it exists) from an object, by name. For example, to select and run a simple function with no arguments:
fname = raw_input()
if hasattr(shell, fname):
    func = getattr(shell, fname)
    result = func()
else:
    print('That function is not defined.')
Of course you could first tokenize the user input to pass arguments to your function as needed, like for your cat example:
user_input = raw_input().split()  # tokenize
fname, *args = user_input  # This use of *args syntax is not available prior to Py3
if hasattr(shell, fname):
    func = getattr(shell, fname)
    result = func(*args)  # The *args syntax here is available back to at least 2.6
else:
    print('That function is not defined.')
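Putting the pieces together inside the question's runShell loop might look roughly like this (a sketch, assuming Python 2 and the Shell class from the question, with simplified exit handling):

def runShell(user_name):
    shell = Shell()
    while True:
        tokens = raw_input('$' + user_name + ': ').split()
        if not tokens:
            continue
        fname, args = tokens[0], tokens[1:]
        if fname in ('exit', 'quit'):
            break
        if hasattr(shell, fname):
            getattr(shell, fname)(*args)
        else:
            print('That function is not defined.')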

conditionally close file on exit from function

I have a (recursive) function which I would like to accept either a string or an opened file object. If the argument is a string, then the function opens a file and uses that file object. It seems best to close this opened file object explicitly when I return from the function, but only if a string was passed in. (Imagine the surprise from the user when they pass in an opened file object and find that their file object was closed somewhere). Here's what I'm currently using:
def read_file(f, param):
    do_close = isinstance(f, basestring)
    f = open(f, 'rb') if do_close else f
    try:
        info = f.read(4)
        # check info here
        if info == Info_I_Want(param):
            return f.read(get_data(info))
        else:
            f.seek(goto_new_position(info))
            return read_file(f, param)
    except IKnowThisError:
        return None
    finally:
        if do_close:
            f.close()
You can assume that IKnowThisError will be raised at some point if I don't find the info I want.
This feels very kludgy. Is there a better way?
Why not wrap your recursive function in a wrapper to avoid the overhead?
def read_file(f, param):
    if isinstance(f, basestring):
        with open(f, 'rb') as real_f:
            return read_file2(real_f, param)
    else:
        return read_file2(f, param)

def read_file2(f, param):
    # Now f should be a file object
    ...
How about calling your function recursively?
def read_file(f, param):
    if isinstance(f, basestring):
        with open(f, 'rb') as real_f:
            return read_file(real_f, param)
    else:
        # normal path
The upcoming Python 3.3 offers a more general solution for this kind of problem, namely contextlib.ExitStack. This allows you to conditionally add context managers to the current with block:
from contextlib import ExitStack

def read_file(f, param):
    with ExitStack() as stack:
        if isinstance(f, basestring):
            f = stack.enter_context(open(f, 'rb'))
        # Your code here
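Either way, the caller can then pass in a path or an already-open file object, and only the path case gets closed on return (a sketch; records.bin and param are placeholders for whatever your real code uses):

data = read_file('records.bin', param)   # opened and closed inside read_file

with open('records.bin', 'rb') as fh:
    data = read_file(fh, param)          # the caller's handle stays open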

Wrapper to write to multiple streams

In Python, is there an easy way to set up a file-like object for writing that is actually backed by multiple output streams? For instance, I want something like this:
file1 = open("file1.txt", "w")
file2 = open("file2.txt", "w")
ostream = OStreamWrapper(file1, file2, sys.stdout)
#Write to both files and stdout at once:
ostream.write("ECHO!")
So what I'm looking for is OStreamWrapper. I know it'd be pretty easy to write my own, but if there's an existing one, I'd rather use that and not have to worry about finding and covering edge cases.
class OStreamWrapper(object):
    def __init__(self, *streams):
        self.streams = list(streams)

    def write(self, string):
        for stream in self.streams:
            stream.write(string)

    def writelines(self, lines):
        # If you want to use stream.writelines(), you have
        # to convert lines into a list/tuple as it could be
        # a generator.
        for line in lines:
            for stream in self.streams:
                stream.write(line)

    def flush(self):
        for stream in self.streams:
            stream.flush()
A way to wrap all public file methods:
import sys

def _call_for_all_streams(func_name):
    def wrapper(self, *args, **kwargs):
        result = []
        for stream in self._streams:
            func = getattr(stream, func_name)
            result.append(func(*args, **kwargs))
        return result
    return wrapper

class OStreamWrapper(object):
    def __init__(self, *streams):
        self._streams = streams

# Note: 'file' is the Python 2 built-in file type.
for method in filter(lambda x: not x.startswith('_'), dir(file)):
    setattr(OStreamWrapper, method, _call_for_all_streams(method))

if __name__ == '__main__':
    file1 = open("file1.txt", "w")
    file2 = open("file2.txt", "w")
    ostream = OStreamWrapper(file1, file2, sys.stdout)
    ostream.write("ECHO!")
    ostream.close()
But it's kind of dirty.
Logbook is another option, although it is more than just that: its handlers are more powerful and you can combine them however you like.
