Is there a way to free used memory in Spyder? - python

I have scheduled a Python web scraper to run every day at a specified time. This puts a load on Spyder's memory and, after a while, results in a system crash. Is there a way to solve this issue?

I had the same problem using the Spyder IDE for a long-running process.
Things like deleting variables and calling gc.collect() didn't free memory within a loop in my script on Ubuntu 20.04.
The way I got around the memory crashes in long loops inside Spyder was to run the script from a terminal instead, using python my_script.py. This worked for me, and my loops and long processes no longer crash my machine. Good luck!
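For reference, this is roughly the in-loop cleanup that did not help for me inside Spyder (a minimal sketch; scrape_page stands in for the real scraping work):

    import gc

    def scrape_page(i):
        # Placeholder for the real scraping work; builds a large throwaway object.
        return ["row %d" % i] * 100000

    for i in range(1000):
        result = scrape_page(i)
        # ... process result here ...
        del result     # drop the reference explicitly
        gc.collect()   # force a collection pass; memory still grew inside Spyder

Running the same file from a terminal (or from cron/Task Scheduler) means the interpreter exits after each run, so the operating system reclaims all of its memory regardless of what the script holds onto.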

Related

Running multiple programs in parallel in Spyder

I want to run multiple programs in parallel in Spyder. How do I do it? I tried deactivating "Use a single instance" under Preferences > Applications, but this didn't help.
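If it helps, one way to drive several scripts in parallel outside of Spyder's consoles is a small launcher like this (a sketch; the script names are placeholders):

    import subprocess
    import sys

    # Hypothetical file names; replace with the programs you want to run.
    scripts = ["program_a.py", "program_b.py"]

    # Start each script in its own interpreter so they run concurrently.
    procs = [subprocess.Popen([sys.executable, s]) for s in scripts]

    for p in procs:
        p.wait()  # block until every script has finished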

D-Tale on Jupyter Notebook keeps running forever

I am using a Win10 machine with Python 3.7.3.
I recently started using the D-Tale Python library for EDA in Jupyter Notebook.
The problem I am facing is that when I initialize and run it, the cell keeps running forever, which is only logical because it's a process. But because of that, I cannot run any cells below it. Am I doing something wrong?
I thought about creating a new notebook every time and running only D-Tale there, applying the code I need in a separate one. Any ideas from users who are familiar with this problem?
I really appreciate any help you can provide.
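For context, a minimal version of what I run looks like this (the CSV path is a placeholder for my actual data):

    import dtale
    import pandas as pd

    df = pd.read_csv("data.csv")  # placeholder dataset
    d = dtale.show(df)            # this is the cell that keeps running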

How to stop PyCharm from overloading RAM?

I've recently started with Python using PyCharm. My problem is that whenever I start PyCharm, it first starts normally, but after about a minute, without my touching anything, I experience lags caused by my memory running at 97% load. This is due to many Python processes opening in the background. These are all called "Python (32 Bit)" and use about 0.9 MB apiece. In total, about 3-4 GB of memory is occupied by these processes. Whenever I close PyCharm, these processes don't close, which leads to me restarting my computer.
I have 8 GB of memory, but CPU and GPU usage are completely normal.
Since I only managed to get pygame working with PyCharm, another IDE isn't an option.
Does anyone know how to stop that from happening, or at least how I can kill all the background processes?
I'm not sure if it helps, but you can change the amount of memory used by the PyCharm IDE via the -Xmx setting in pycharm.vmoptions (see the JetBrains documentation). Note that you can also simply kill the background processes manually in the task manager/terminal.
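For example, the heap limit line in the vmoptions file (open it via Help > Edit Custom VM Options; the value below is just an illustration) looks like:

    # pycharm64.vmoptions
    -Xmx1024m

Lowering -Xmx caps how much heap the IDE's JVM can claim; it does not affect the Python processes your code spawns, which you may still need to kill manually.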

Python 2.7 script crashing Ubuntu 16.04: how to find the reason?

I run a complex Python program that is computationally demanding.
While it is complex in terms of number of lines of code, the code itself is simple: it is not multi-threaded, not multi-process, and does not use any "external" library, with the exception of colorama, installed via pip.
The program does not require a large amount of memory.
When I run it and monitor it via htop, one (of the eight) CPUs is used 100% by the script, and around 1.16 GB (out of 62.8 GB) of memory is used (this number remains more or less steady).
After a while (10 to 20 minutes) of running the script, my Dell desktop running Ubuntu 16.04 systematically freezes. I can move the mouse, but clicks do not work, the keyboard is unresponsive, and running programs (e.g. htop) freeze. I can only (hard) reboot. Note that the last frame displayed by htop does not show anything unexpected (e.g. no spike in memory usage).
I never experience such freezes when not running the Python program.
I do nothing special in parallel with running the script, aside from browsing with Firefox or dealing with mail in Thunderbird (i.e. nothing that would use CPU or RAM in a significant fashion).
I have printed traces in my Python code: it never crashes in the same state.
I also watch the kernel logs in another terminal: nothing special is printed at the time of the freeze.
I do not use any IDE, and run the script directly from a terminal.
Searching for similar issues, they usually seem to be related to overuse of memory, which does not appear to be my case.
I have no idea how to investigate this issue.
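One cheap sanity check, to rule memory out for certain, is to have the script log its own peak usage at regular points (a sketch using only the standard library; ru_maxrss is reported in kilobytes on Linux):

    import resource

    def log_usage(tag):
        usage = resource.getrusage(resource.RUSAGE_SELF)
        # ru_maxrss is in kilobytes on Linux
        print("%s: peak RSS %.1f MB, user CPU %.1f s" % (
            tag, usage.ru_maxrss / 1024.0, usage.ru_utime))

    log_usage("checkpoint")

If the logged peak stays flat right up to the freeze, the cause is more likely a driver or hardware issue than the script itself.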

Python shell memory management for an always-running script

I have written a Python (version 3) script that runs 24/7. The way I run it on my Windows machine is the following: I right-click on the .py file, then click "Edit with IDLE" and then "Run". The script itself has no issues, but due to the many instructions printed in the Python shell (I use a logger), after a couple of days the shell gets very heavy.

My newbie question is the following: is there a way to limit the number of rows temporarily kept in the Python shell to a specific number? Or perhaps somebody has a better suggestion for running this constantly running script that prints a lot of its steps to the shell?

Please notice that I'm not asking how to run a script 24/7; it's my understanding that the best way to do that is through a VPS. My problem is that the data displayed in the Python shell gets bigger and bigger every day, so I only wonder how to limit the data temporarily displayed/saved in it. Thanks.
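As far as I know, IDLE has no setting to cap the shell's scrollback, so the practical fix is to stop printing to the shell and send the log lines to a rotating file instead (a minimal sketch using only the standard library; the logger name, file name, and size limits are placeholders):

    import logging
    from logging.handlers import RotatingFileHandler

    logger = logging.getLogger("my_script")
    logger.setLevel(logging.INFO)

    # Keep at most ~5 MB across two files; older lines are discarded automatically.
    handler = RotatingFileHandler("script.log", maxBytes=5 * 1024 * 1024, backupCount=1)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)

    logger.info("step done")  # goes to the file, not the IDLE shell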
