Memory overflow when saving Matplotlib plots in a loop - python

I am using an iterative loop to plot some data with Matplotlib. After the code has saved around 768 plots, it throws the following exception:
RuntimeError: Could not allocate memory for image
My computer has around 3.5 GB RAM.
Is there any way to free the memory as the loop runs so that it does not get exhausted?

Are you remembering to close your figures when you are done with them? e.g.:
import matplotlib.pyplot as plt

fig = plt.figure()
# generate figure contents here
# ...
plt.close(fig)  # release the resources associated with fig
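Applied to a save loop like the one in the question, a minimal sketch of that idea (the data list and output filenames are hypothetical stand-ins):

import matplotlib.pyplot as plt
import numpy as np

datasets = [np.random.rand(100) for _ in range(1000)]  # hypothetical data

for i, data in enumerate(datasets):
    fig, ax = plt.subplots()
    ax.plot(data)
    fig.savefig(f'plot_{i}.png')
    plt.close(fig)  # free the figure's memory before the next iteration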

As a slightly different answer, remember that you can re-use figures. Something like:
import matplotlib.pyplot as plt

fig = plt.figure()
ax = plt.gca()
im = ax.imshow(data_list[0], ...)
for new_data in data_list:
    im.set_data(new_data)  # update the existing image in place
    fig.savefig(...)
This will make your code run much faster, as it will not need to set up and tear down the figure 700+ times.
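Note that set_data keeps the existing color scaling; if the data range changes between frames, you may also need im.set_clim(new_data.min(), new_data.max()) before saving so each image is mapped correctly.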

Related

clear memory used by mplfinance

I'm using the mplfinance module to plot candlesticks. The problem is that mplfinance uses too much memory when it generates plots. I have tried the instructions mentioned in free up the memory used by matplotlib, but nothing changed and my code is still filling up my computer's memory. Here is my code:
import mplfinance as mpf

fig, axlist = mpf.plot(hloc, hlines=hlines,
                       ylabel='Price(USDT)', type='candle',
                       style='binance', title=my_title,
                       closefig=True, returnfig=True)
Any suggestion is highly appreciated.
It would be helpful to see the rest of your code, to see how you are displaying plots and how many. That said, given the above code, when you are done with each plot you might try:
for ax in axlist:
    del ax
del fig
This will save memory, but at the expense of some time (which will not be noticeable anyway unless you are making thousands of plots).
If you are saving your plots to image files (instead of displaying them on screen), then matplotlib.use("Agg") may help as well.
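Putting those pieces together, a minimal sketch of a save loop, assuming each chart is written to an image file rather than displayed (the dataframes list and the filename pattern are hypothetical):

import matplotlib
matplotlib.use("Agg")  # non-interactive backend: nothing is drawn to the screen
import mplfinance as mpf

for i, hloc in enumerate(dataframes):  # hypothetical list of OHLC DataFrames
    mpf.plot(hloc, type='candle', style='binance',
             ylabel='Price(USDT)', title=my_title,
             closefig=True,             # release the figure once it is saved
             savefig=f'chart_{i}.png')  # write to disk instead of displaying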

How can I avoid memory leaks with real-time plotting (matplotlib) in Jupyter Notebook?

I'm training a large DQN in a Jupyter notebook. I'm having some trouble finding a way to update this plot in real time without causing a memory leak. I currently have a dirty implementation that uses ~1 GB of RAM per episode (14,000 steps). By the time I've gotten through 7 episodes, I'm about halfway out of memory on my system.
From what I've read in other posts, attempting to plot in the same thread will cause a memory leak regardless of gc.collect() or del fig, fig.clear(), etc. How can I update this plot within a loop without causing a memory leak?
I found a similar question here, but couldn't quite figure out how to apply it in my case with multiple figures and data that is updated dynamically. My current implementation, which runs once per loop, looks like this:
from IPython.display import clear_output
import matplotlib.pyplot as plt

clear_output(wait=True)
plt.close()
plt.ion()
fig, axs = plt.subplots(2, figsize=(10, 7))
fig.tight_layout()
color = [int((item + 1) * 255 / 2) for item in p_reward_history]
axs[0].scatter(tindex, p_reward_history[-plot_len:], c=color[-plot_len:], cmap='RdYlGn', linewidth=3)
axs[0].set_title('P&L Individual Transactions')
axs[0].plot(zero_line, color="black", linewidth=3)
axs[0].set_facecolor('#2c303c')
axs[1].set_title('P&L Running Total')
axs[1].set_facecolor('#2c303c')
axs[1].plot(running_rewards_history, color="#94c273", linewidth=3)
The dynamic variables are running_rewards_history and p_reward_history; both are lists that get new values appended each loop.
I prefer to work in a Jupyter notebook, but if I need to train in a regular shell in order to update asynchronously, that is okay with me.
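One way to apply the figure re-use idea from the earlier answer here is to create the figure and its artists once, then update their data in place and redraw the cell output, instead of rebuilding everything each step. A minimal sketch for a Jupyter notebook follows; run_training_step is a hypothetical stand-in for the training-loop body, and only the running-total panel is shown:

from IPython.display import display, clear_output
import matplotlib.pyplot as plt

plt.ioff()  # keep interactive mode from emitting duplicate inline figures
fig, ax = plt.subplots(figsize=(10, 4))
ax.set_title('P&L Running Total')
ax.set_facecolor('#2c303c')
line, = ax.plot([], [], color="#94c273", linewidth=3)

running_rewards_history = []
for step in range(14000):
    running_rewards_history.append(run_training_step())  # hypothetical
    line.set_data(range(len(running_rewards_history)), running_rewards_history)
    ax.relim()             # recompute the data limits for the appended values
    ax.autoscale_view()
    clear_output(wait=True)
    display(fig)           # redraw the same figure in the cell output
plt.close(fig)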

matplotlib loads memory and does not show plot

I want to plot a large (>100k rows) file in Matplotlib. The first time I do it, I get the result I need. However, if I restart and rerun the kernel, plt.show() consumes memory without end and never shows the graph.
I tried restarting Jupyter Notebook and Anaconda, but the problem remains.
import pandas as pd
import matplotlib.pyplot as plt

dataset = f'data/data_name.csv'
df = pd.read_csv(dataset)
pd.options.display.float_format = '{:.2f}'.format
df.set_index('time', inplace=True)
plt.figure(figsize=(18, 6))
plt.plot(df['some_column'])
plt.show()
From that moment, an instance of Python appears in my process list and starts consuming memory without end.
Thank you in advance.
It appears the memory on your machine is being overwhelmed by the size of the plot and is crashing your kernel. I'd suggest plotting fewer datapoints using df.sample(n=10**4, random_state=1). If your data is massive and nicely distributed, taking a sample should reduce the memory and allow for more rapid plotting.
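A minimal sketch of that suggestion, built on the snippet from the question (sort_index keeps the sampled points in chronological order):

import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('data/data_name.csv')
df.set_index('time', inplace=True)

# plot a 10,000-row random sample instead of all >100k rows
sample = df['some_column'].sample(n=10**4, random_state=1).sort_index()
plt.figure(figsize=(18, 6))
plt.plot(sample)
plt.show()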

Matplotlib plotting is slow

I am relatively new to Python/Matplotlib and have to plot huge NumPy arrays (more than 6 million entries each). The problem is that the 6 plots I have take more than 3 GB of RAM and take very long to load.
I researched a bit and found out that I can speed Matplotlib up by not recreating the axes and title every time.
So now the code looks like this, but it is still quite slow.
Should I use another module instead of matplotlib?
How could I speed up the process?
Thanks a lot in advance.
for key, typus in self.sensorObjects.items():
    fig, ax = plt.subplots()
    ax.set_title(key)
    ax.set_xlabel('t (ms)')
    ax.set_ylabel(self.sensorObjects[key][0].unit)
    for sensor in typus:
        data = sensor.physical_data
        ax.plot(data)
    fig.canvas.draw()  # render the figure
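A sketch of the figure re-use idea mentioned above, assuming the plots can be saved to files rather than shown in a window; the Agg backend, the decimation step, and the output filenames are assumptions, not part of the original code:

import matplotlib
matplotlib.use("Agg")  # non-interactive backend avoids GUI overhead
import matplotlib.pyplot as plt

fig, ax = plt.subplots()  # create the figure once, outside the loop

for key, typus in self.sensorObjects.items():
    ax.clear()  # reuse the same axes instead of rebuilding the figure
    ax.set_title(key)
    ax.set_xlabel('t (ms)')
    ax.set_ylabel(typus[0].unit)
    for sensor in typus:
        ax.plot(sensor.physical_data[::10])  # decimate: plot every 10th point
    fig.savefig(f'{key}.png')  # hypothetical output path
plt.close(fig)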

how to stop pyplot from popping up figures

I write a number of plots to a PDF with a loop like the following. It works, but there are two very annoying issues:
1) When the loop runs, I see a lot of windows ('Figure 1') pop up. I think the command plt.close(fig) does not work as intended. This is really annoying, because I might be doing something else while it runs, and those pop-ups block my view of other tasks.
2) Probably related to 1), memory usage goes up dramatically. In my real script, plotting something like 50 pages of graphs eats up more than 32 GB of RAM. How could that be?!
from matplotlib.backends.backend_pdf import PdfPages
import matplotlib.pyplot as plt

with PdfPages('Manyplots.pdf') as pdf:
    for j in xrange(100):
        fig = plt.figure(1, figsize=(5, 5))
        for fr in xrange(9):
            pp = fig.add_subplot(3, 3, fr + 1)
            pp.imshow(x, cmap=plt.cm.gray)
        pdf.savefig()
        plt.close(fig)
My questions are:
1) Is there any way to close a figure after its plot is done?
2) Better still, how can I suppress the blank figure pop-ups, since the loop should really be writing to an external file in the background?
3) Is there any better way to save a series of plots to multiple pages of a PDF?
Found the cause of the problem. My main script imports someone's utility script, which imports pyplot and has an extra line:
import matplotlib.pyplot as plt
plt.ion()
When plt.ion() is commented out, the pop-ups are gone.
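If you cannot edit the utility script, an alternative (an assumption on my part, not part of the fix above) is to force a non-interactive backend before pyplot is ever imported, so no figure windows can be created at all:

import matplotlib
matplotlib.use("Agg")  # must run before anything imports pyplot
import matplotlib.pyplot as plt  # the utility script's plt.ion() now has no window to open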
