python script leaves zombie processes [closed] - python

So I ran a Python script on some files with
ls * | xargs python myscript.py
After the process finished normally, I looked at the processes with
ps aux
and found that the process was still in the list; I had to kill it manually. Why is this? Here is my Python script.
import sys
filex = open(sys.argv[1])
for line in filex:
    # do sth here
    pass

Unless you have left it out of your post, you are not calling close() on the file afterwards.
Try calling filex.close() at the end, or alternatively use 'with':
with open(sys.argv[1]) as filex:
    for line in filex:
        pass  # do the things
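For completeness, a minimal sketch of what the whole script could look like with the with statement (the print call is just a placeholder for the per-line work):
import sys

def main():
    # Open the file named on the command line; the with block guarantees
    # the handle is closed even if the loop raises.
    with open(sys.argv[1]) as filex:
        for line in filex:
            # placeholder for the real per-line processing
            print(line.rstrip())

if __name__ == "__main__":
    main()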

Related

How to get the terminal full history [closed]

I am trying to save everything my Python app prints to the terminal. Every solution I found captures the output of a single command using Popen from subprocess, but I want to capture everything. Is there a built-in function for this purpose, or should I do it manually, which I'd rather not?
What you want to do is log the output, sending it to stdout as well as a file; see this answer:
Making Python loggers output all messages to stdout in addition to log file
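A minimal sketch of that approach (the logger name and the app.log path are just examples):
import logging
import sys

# One logger, two handlers: every message goes to the terminal and to app.log.
logger = logging.getLogger("myapp")
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(sys.stdout))
logger.addHandler(logging.FileHandler("app.log"))

logger.info("this goes to the console and to app.log")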
Why don't you just run it from the terminal with a redirect:
python main.py > output.log
You can try the script command in your shell. It saves everything printed to the terminal from the moment you call it; end the capture by typing exit. You can also call it from your Python script. Hope it helps.
script manual page
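If you would rather start the capture from Python, here is a sketch that wraps the app in script via subprocess (the -c form is the Linux/util-linux syntax; BSD/macOS script takes the output file first and the command after it):
import subprocess

# Run the app under `script` so everything it prints to the terminal
# is also written to output.log.
subprocess.run(["script", "-c", "python main.py", "output.log"], check=True)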

How to debug with exec in Python [closed]

I have a file with the code and I want to run it using exec(file.read()). However, when I put a breakpoint in that file, it is not reached. How can I run the file so that the breakpoint works?
That's not the usual way of running Python files, but if you have a breakpoint() in the middle of the file, it should work.
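A hedged sketch of that (other_file.py is a made-up name): compiling the source with its real filename before exec'ing it lets the debugger show the surrounding lines when the breakpoint() is hit.
# Read the target file and compile it with its own filename so pdb can
# display the source when breakpoint() fires inside it.
path = "other_file.py"  # hypothetical file containing a breakpoint() call
with open(path) as f:
    code = compile(f.read(), path, "exec")
exec(code)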
I think you actually want to import the file or run it directly.

Testing vi Editor Functionality with Python? [closed]

How do I write an automated test in Python that checks that the vi editor is working properly?
Save a shell script that drives vi through a here-document, for example script.sh (the ^[ below is a literal Escape character, typed as Ctrl-V followed by Esc):
cat script.sh
vi abc.txt <<INPUT
i
Line 1
Line 2
^[
ZZ
INPUT
Use Python's subprocess.check_call to check the status of the execution (make script.sh executable first):
import subprocess
subprocess.check_call(["./script.sh", "arg1"])
This runs the command (script.sh) with arguments if needed and waits for it to complete. If the return code is zero it returns; otherwise it raises CalledProcessError.
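If you also want the test to verify the result, here is a sketch (it assumes script.sh is executable in the current directory and that the here-document approach works on your system):
import subprocess

# Run the wrapper script; check_call raises CalledProcessError on failure.
subprocess.check_call(["./script.sh"])

# Then verify that vi actually wrote the expected lines.
with open("abc.txt") as f:
    assert f.read() == "Line 1\nLine 2\n", "vi did not write the expected text"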

How can I beautify all files using Python script or some other ways? [closed]

I have a lot of unindented C/C++ source files. I want to beautify all of them programmatically. How can I achieve that?
Write a Python script that calls a C/C++ beautifier such as Artistic Style.
http://astyle.sourceforge.net/
Here's roughly how your Python script would look:
import subprocess

# Collect all the files you need to beautify in "files"

# Loop through each C/C++ file
for file in files:
    command = "astyle " + file
    process = subprocess.Popen(command.split(), stdout=subprocess.PIPE)
    output, error = process.communicate()
Example usage of Artistic Style is:
astyle example.cpp
or
astyle --style=ansi --indent=tab example.cpp
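If you also want the file collection spelled out, here is a sketch using glob (the src/ patterns and the astyle options are just examples):
import glob
import subprocess

# Gather the C/C++ sources, then run astyle on each one; check=True raises
# if astyle exits with a non-zero status.
files = glob.glob("src/**/*.cpp", recursive=True) + glob.glob("src/**/*.h", recursive=True)
for path in files:
    subprocess.run(["astyle", "--style=ansi", "--indent=tab", path], check=True)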

Run command in background with Fabric [closed]

I want to run a background script with Fabric using Bash's '&' operator. Is there a reason the following doesn't work? When I run the command on the server itself, it works fine.
@task
def run_script():
    sudo('sh /home/ubuntu/wlmngcntl.sh start &', user='myuser')
I don't want to use something heavy like Celery to do this simple thing. I don't need to capture the output at all, all I want is for the task to execute this and return after.
This isn't a Fabric thing, but a Linux thing. When you close a session, the processes connected to that session are terminated.
This question has a lot of info... https://askubuntu.com/questions/8653/how-to-keep-processes-running-after-ending-ssh-session
You could use the following (from that answer):
sudo('nohup sh /home/ubuntu/wlmngcntl.sh start &', user='myuser')
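A slightly fuller sketch of the task in Fabric 1.x style (the output redirection and pty=False are the usual extra steps so the SSH channel can close without killing the script; adjust paths to your setup):
from fabric.api import task, sudo

@task
def run_script():
    # nohup detaches the script from the session, the redirects free the
    # channel, and pty=False stops Fabric from waiting on a pseudo-terminal.
    sudo('nohup sh /home/ubuntu/wlmngcntl.sh start > /dev/null 2>&1 &',
         user='myuser', pty=False)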
