I have a NumPy array of floats ranging between 0 and 1, which I am displaying with matplotlib's imshow.
I am trying to create a custom cmap that is identical to Greens, except that when a cell is 0 I would like that value to be mapped to red, leaving the rest of the spectrum unchanged.
If anyone more familiar with matplotlib could help me I would greatly appreciate it!
For instance, how would I edit this script so that the zero value in the matrix shows as red?
import numpy as np
from matplotlib import pyplot as plt
import matplotlib
x = np.array([[0,1,2],[3,4,5],[6,7,8]])
fig = plt.figure()
cmap_custom = matplotlib.cm.Greens
plt.imshow(x, interpolation='nearest', cmap=cmap_custom)
plt.colorbar()
plt.show()
The colormaps in matplotlib allow you to set special colors for values that fall outside of the defined range. In your case, specify the color for values below the defined range with cmap_custom.set_under('r').
Then you also need to specify the lower end of the range: vmin=0.01 (just some value > 0).
Finally, create the colorbar with plt.colorbar(extend='min').
import numpy as np
from matplotlib import pyplot as plt
import matplotlib
x = np.array([[0,1,2],[3,4,5],[6,7,8]])
fig = plt.figure()
cmap_custom = matplotlib.cm.Greens.copy()  # copy so the registered 'Greens' colormap is not modified in place
cmap_custom.set_under('r')
plt.imshow(x, interpolation='nearest', cmap=cmap_custom, vmin=0.01)
plt.colorbar(extend='min')
plt.show()
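If you would rather keep vmin at the true minimum and colour only the cells that are exactly zero, a variant of the same idea (just a sketch, not part of the original answer) is to mask the zeros and give the colormap a "bad" colour:
import numpy as np
from matplotlib import pyplot as plt
import matplotlib
x = np.array([[0, 1, 2], [3, 4, 5], [6, 7, 8]], dtype=float)
cmap_custom = matplotlib.cm.Greens.copy()   # copy so the registered colormap stays untouched
cmap_custom.set_bad('r')                    # colour used for masked ("bad") cells
x_masked = np.ma.masked_equal(x, 0)         # mask the cells that are exactly zero
plt.imshow(x_masked, interpolation='nearest', cmap=cmap_custom)
plt.colorbar()
plt.show()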
I'm trying to plot 3D data in 2D using orthographic projection. Here is part of what I'm looking for:
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
fig = plt.figure(figsize=(10,10),facecolor='white')
axs = [fig.add_subplot(223)]
axs.append(fig.add_subplot(224))#,sharey=axs[0]))
axs.append(fig.add_subplot(221))#,sharex=axs[0]))
rng = np.random.default_rng(12345)
values = rng.random((100,3))-.5
values[:,1] = 1.6*values[:,1]
values[:,2] = .5*values[:,2]
for ax,axis in zip(axs,['y','x','z']):
    axis1,axis2={'x':(1,2),'y':(0,2),'z':(0,1)}[axis]
    ax.add_patch(plt.Circle([0,0], radius=.2, color='pink',zorder=-20))
    ax.scatter(values[:,axis1],values[:,axis2])
axs[0].set_xlabel('x')
axs[2].set_ylabel('y')
axs[1].set_xlabel('y')
axs[0].set_ylabel('z')
fig.subplots_adjust(.08,.06,.99,.99,0,0)
plt.show()
There are some issues with this plot and the fixes I tried. I need 'equal' aspect so that the circles are actually circles, I need the circles to be the same size in each subplot, and I would like the space to be optimized (i.e. as little white space inside and between the subplots as possible).
I have tried sharing the axes between the subplots and then calling .axis('scaled') or .set_aspect('equal','box',share=True) on each axes, but the axes end up not being properly shared and the circle in each subplot ends up a different size. And while this crops the subplots to the data, it leaves a lot of space between the subplots. .axis('equal') or .set_aspect('equal','datalim',share=True) without shared axes leaves white space inside the subplots, and with shared axes it leaves out some data.
Is there any way to make this work? It would be perfect if it worked on matplotlib 3.4.3.
You can use a common xlim/ylim for your subplots and set an equal aspect ratio with ax.set_aspect(aspect='equal', adjustable='datalim'):
See full code below:
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
fig = plt.figure(figsize=(10,10),facecolor='white')
axs = [fig.add_subplot(223)]
axs.append(fig.add_subplot(224))#,sharey=axs[0]))
axs.append(fig.add_subplot(221))#,sharex=axs[0]))
rng = np.random.default_rng(12345)
values = rng.random((100,3))-.5
values[:,1] = 1.6*values[:,1]
values[:,2] = .5*values[:,2]
for ax,axis in zip(axs,['y','x','z']):
    axis1,axis2={'x':(1,2),'y':(0,2),'z':(0,1)}[axis]
    ax.add_patch(plt.Circle([0,0], radius=.2, color='pink',zorder=-20))
    ax.scatter(values[:,axis1],values[:,axis2])
    ax.set_xlim([np.amin(values),np.amax(values)])
    ax.set_ylim([np.amin(values),np.amax(values)])
    ax.set_aspect('equal', adjustable='datalim')
axs[0].set_xlabel('x')
axs[2].set_ylabel('y')
axs[1].set_xlabel('y')
axs[0].set_ylabel('z')
fig.subplots_adjust(.08,.06,.99,.99,0,0)
plt.show()
The output gives:
I made it work using GridSpec (I changed scatter to plot to visually make sure no data gets left out). It requires some tweaking of the figsize to really minimize the white space within the axes. Thank you to @jylls for the intermediate solution.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.gridspec import GridSpec
%matplotlib inline
rng = np.random.default_rng(12345)
values = rng.random((100,3))-.5
values[:,1] = 1.6*values[:,1]
values[:,2] = .5*values[:,2]
fig = plt.figure(figsize=(10,8),facecolor='white')
ranges = np.ptp(values,axis=0)
gs = GridSpec(2, 2, left=.08, bottom=.06, right=.99, top=.99, wspace=0, hspace=0,
              width_ratios=[ranges[0], ranges[1]], height_ratios=[ranges[1], ranges[2]])
axs = [fig.add_subplot(gs[2])]
axs.append(fig.add_subplot(gs[3]))#,sharey=axs[0]))
axs.append(fig.add_subplot(gs[0]))#,sharex=axs[0]))
for ax,axis in zip(axs,['y','x','z']):
    axis1,axis2={'x':(1,2),'y':(0,2),'z':(0,1)}[axis]
    ax.add_patch(plt.Circle([0,0], radius=.2, color='pink',zorder=-20))
    ax.plot(values[:,axis1],values[:,axis2])
    ax.set_aspect('equal', adjustable='datalim')
axs[0].set_xlabel('x')
axs[2].set_ylabel('y')
axs[1].set_xlabel('y')
axs[0].set_ylabel('z')
plt.show()
I frequently find myself working in log units for my plots, for example taking np.log10(x) of data before binning it or creating contour plots. The problem is, when I then want to make the plots presentable, the axes are in ugly log units, and the tick marks are evenly spaced.
If I let matplotlib do all the conversions, i.e. by setting ax.set_xscale('log'), then I get very nice looking axes; however I can't do that to my data since it is e.g. already binned in log units. I could manually change the tick labels, but that wouldn't make the tick spacing logarithmic. I suppose I could also go and manually specify the position of every minor tick so that it has log spacing, but is that the only way to achieve this? That is a bit tedious, so it would be nice if there were a better way.
For concreteness, here is a plot:
I want to have the tick labels as 10^x and 10^y (so '1' becomes '10', '2' becomes '100', etc.), and I want the minor ticks to be drawn as ax.set_xscale('log') would draw them.
Edit: For further concreteness, suppose the plot is generated from an image, like this:
import matplotlib.pyplot as plt
import scipy.misc
img = scipy.misc.face()
x_range = [-5,3] # log10 units
y_range = [-55, -45] # log10 units
p = plt.imshow(img,extent=x_range+y_range)
plt.show()
and all we want to do is change the axes appearance as I have described.
Edit 2: Ok, ImportanceOfBeingErnest's answer is very clever but it is a bit more specific to images than I wanted. I have another example, of binned data this time. Perhaps their technique still works on this, though it is not clear to me if that is the case.
import numpy as np
import pandas as pd
import datashader as ds
from matplotlib import pyplot as plt
import scipy.stats as sps
v1 = sps.lognorm(loc=0, scale=3, s=0.8)
v2 = sps.lognorm(loc=0, scale=1, s=0.8)
x = np.log10(v1.rvs(100000))
y = np.log10(v2.rvs(100000))
x_range=[np.min(x),np.max(x)]
y_range=[np.min(y),np.max(y)]
df = pd.DataFrame.from_dict({"x": x, "y": y})
#------ Aggregate the data ------
cvs = ds.Canvas(plot_width=30, plot_height=30, x_range=x_range, y_range=y_range)
agg = cvs.points(df, 'x', 'y')
# Create contour plot
fig = plt.figure()
ax = fig.add_subplot(111)
ax.contourf(agg, extent=x_range+y_range)
ax.set_xlabel("x")
ax.set_ylabel("y")
plt.show()
The general answer to this question is probably given in this post:
Can I mimic a log scale of an axis in matplotlib without transforming the associated data?
However here an easy option might be to scale the content of the axes and then set the axes to a log scale.
A. image
You may plot your image on a logarithmic scale but make all pixels the same size in log units. Unfortunately imshow does not allow for this kind of image (any more), but one may use pcolormesh for that purpose.
import numpy as np
import matplotlib.pyplot as plt
import scipy.misc
img = scipy.misc.face()
extx = [-5,3] # log10 units
exty = [-45, -55] # log10 units
# Cell edges, equally spaced in log10 units (one more edge than pixels in each direction)
x = np.logspace(extx[0],extx[-1],img.shape[1]+1)
y = np.logspace(exty[0],exty[-1],img.shape[0]+1)
X,Y = np.meshgrid(x,y)
# One quadrilateral per pixel, coloured directly with the image's RGB values
c = img.reshape((img.shape[0]*img.shape[1],img.shape[2]))/255.0
m = plt.pcolormesh(X,Y,X[:-1,:-1], color=c, linewidth=0)
m.set_array(None)  # discard the dummy array so the per-face colors are used
plt.gca().set_xscale("log")
plt.gca().set_yscale("log")
plt.show()
B. contour
The same concept can be used for a contour plot.
import numpy as np
from matplotlib import pyplot as plt
x = np.linspace(-1.1,1.9)
y = np.linspace(-1.4,1.55)
X,Y = np.meshgrid(x,y)
agg = np.exp(-(X**2+Y**2)*2)
fig, ax = plt.subplots()
plt.gca().set_xscale("log")
plt.gca().set_yscale("log")
exp = lambda x: 10.**(np.array(x))  # convert log10 units back to linear units
cf = ax.contourf(exp(X), exp(Y), agg)  # extent is not needed once X and Y are given explicitly
ax.set_xlabel("x")
ax.set_ylabel("y")
plt.show()
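For binned data like the Edit 2 example, the same idea should carry over. Here is a rough sketch of that, using np.histogram2d in place of datashader so it stays self-contained (the 30 bins mirror the Canvas size from the question):
import numpy as np
from matplotlib import pyplot as plt
import scipy.stats as sps
# Bin log10-transformed samples, as in Edit 2, but with np.histogram2d
logx = np.log10(sps.lognorm(loc=0, scale=3, s=0.8).rvs(100000))
logy = np.log10(sps.lognorm(loc=0, scale=1, s=0.8).rvs(100000))
counts, x_edges, y_edges = np.histogram2d(logx, logy, bins=30)
# Bin centres, converted from log10 units back to linear units
x_centers = 10.0 ** ((x_edges[:-1] + x_edges[1:]) / 2)
y_centers = 10.0 ** ((y_edges[:-1] + y_edges[1:]) / 2)
fig, ax = plt.subplots()
ax.set_xscale("log")
ax.set_yscale("log")
ax.contourf(x_centers, y_centers, counts.T)  # transpose: histogram2d returns (x, y) ordering
ax.set_xlabel("x")
ax.set_ylabel("y")
plt.show()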
I am trying to use ax.scatter to make a 3D scatter plot. I've read the data from a FITS file and stored the data from three columns in x, y, z, and I have made sure x, y, z are the same size. z has been normalized between 0 and 1.
import numpy as np
import matplotlib
from matplotlib import pylab,mlab,pyplot,cm
plt = pyplot
import pyfits as pf
from mpl_toolkits.mplot3d import Axes3D
import fitsio
data = fitsio.read("xxx.fits")
x=data["x"]
y=data["y"]
z=data["z"]
z = (z-np.nanmin(z)) /(np.nanmax(z) - np.nanmin(z))
Cen3D = plt.figure()
ax = Cen3D.add_subplot(111, projection='3d')
cmap=cm.ScalarMappable(norm=z, cmap=plt.get_cmap('hot'))
ax.scatter(x,y,z,zdir=u'z',cmap=cmap)
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')
plt.show()
What I am trying to achieve is to use color to indicate the size of z, e.g. a higher value of z gets a darker color. But I keep getting a plot without the colormap I want; the points are all the same default blue color. What did I do wrong? Thanks.
You can use the c keyword in the scatter command to tell it how to color the points.
You don't need to set zdir, as that is for when you are plotting a 2D dataset.
As @Lenford pointed out, you can use cmap='hot' in this case too, since you have already normalized your data.
I've modified your example to use some random data rather than your fits file.
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
x = np.random.rand(100)
y = np.random.rand(100)
z = np.random.rand(100)
z = (z-np.nanmin(z)) /(np.nanmax(z) - np.nanmin(z))
Cen3D = plt.figure()
ax = Cen3D.add_subplot(111, projection='3d')
ax.scatter(x,y,z,cmap='hot',c=z)
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')
plt.show()
As per the pyplot.scatter documentation, the values given to c must be an array of floats for the cmap to apply; otherwise the default colour (in this case, jet) will continue to apply.
As an aside, simply stating cmap='hot' will work for this code, as the colour map hot is a registered colour map in matplotlib.
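If you also want a colorbar showing the mapping from z to colour, one way (a small sketch, not part of the original answer) is to keep the mappable returned by scatter and pass it to the figure's colorbar method:
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
x = np.random.rand(100)
y = np.random.rand(100)
z = np.random.rand(100)
Cen3D = plt.figure()
ax = Cen3D.add_subplot(111, projection='3d')
sc = ax.scatter(x, y, z, c=z, cmap='hot')  # scatter returns a mappable
Cen3D.colorbar(sc, ax=ax, label='z')       # which can feed a colorbar
plt.show()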
I am trying to replace the colorbar given by "hp.mollview" with a custom one. In particular I am interested in:
Rotating the colorbar by 90 degrees (i.e. replacing the horizontal by a vertical one)
Using two labels (left and right of the colorbar)
Setting custom ticks
Indicating that the range is set (via the "max" parameter) by setting "cmap.set_over".
Minimal amount of code:
import numpy as np
import healpy as hp
m = np.arange(hp.nside2npix(32))
hp.mollview(m)
Any help?
I'll expand my comment here:
import numpy as np
import healpy as hp
import matplotlib.pyplot as plt
m = np.arange(hp.nside2npix(32))
hp.mollview(m, cbar=None)
fig = plt.gcf()
ax = plt.gca()
image = ax.get_images()[0]
cbar = fig.colorbar(image, ax=ax)
Then you can customize the colorbar through the arguments of fig.colorbar and the methods of the returned Colorbar object.
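For the specific points in the question, a possible sketch (the tick values, range limit, label texts, and the left-label placement below are placeholders, not from the original answer):
import numpy as np
import healpy as hp
import matplotlib.pyplot as plt
m = np.arange(hp.nside2npix(32))
hp.mollview(m, cbar=None, max=10000)   # 'max' clips the range, as in the question
fig = plt.gcf()
ax = plt.gca()
image = ax.get_images()[0]
# Vertical colorbar with custom ticks; extend='max' marks the clipped upper range.
cbar = fig.colorbar(image, ax=ax, orientation='vertical', extend='max',
                    ticks=[0, 2500, 5000, 7500, 10000])
cbar.set_label('right-hand label')              # label on the right (default side)
cbar.ax.text(-1.5, 0.5, 'left-hand label',      # second label, placed to the left
             transform=cbar.ax.transAxes,
             rotation=90, va='center', ha='center')
plt.show()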
I am attempting to use matplotlib to plot some figures for a paper I am working on. I have two sets of data in 2D numpy arrays: an ASCII hillshade raster, which I can happily plot and tweak using:
import matplotlib.pyplot as pp
import numpy as np
hillshade = np.genfromtxt('hs.asc', delimiter=' ', skip_header=6)[:,:-1]
pp.imshow(hillshade, vmin=0, vmax=255)
pp.gray()
pp.show()
Which gives:
And a second ascii raster which delineates properties of a river flowing across the landscape. This data can be plotted in the same manner as above, however values in the array which do not correspond to the river network are assigned a no data value of -9999. The aim is to have the no data values set to be transparent so the river values overlie the hillshade.
This is the river data, ideally every pixel represented here as 0 would be completely transparent.
Having done some research on this it seems I may be able to convert my data into an RGBA array and set the alpha values to only make the unwanted cells transparent. However, the values in the river array are floats and cannot be transformed (as the original values are the whole point of the figure) and I believe the imshow function can only take unsigned integers if using the RGBA format.
Is there any way around this limitation? I had hoped I could simply create a tuple with the pixel value and the alpha value and plot them like that, but this does not seem possible.
I have also had a play with PIL to attempt to create a PNG file of the river data with the no data value transparent, however this seems to automatically scale the pixel values to 0-255, thereby losing the values I need to preserve.
I would welcome any insight anyone has on this problem.
Just mask your "river" array.
e.g.
rivers = np.ma.masked_where(rivers == 0, rivers)
As a quick example of overlaying two plots in this manner:
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.cm as cm
# Generate some data...
gray_data = np.arange(10000).reshape(100, 100)
masked_data = np.random.random((100,100))
masked_data = np.ma.masked_where(masked_data < 0.9, masked_data)
# Overlay the two images
fig, ax = plt.subplots()
ax.imshow(gray_data, cmap=cm.gray)
ax.imshow(masked_data, cmap=cm.jet, interpolation='none')
plt.show()
Also, on a side note, imshow will happily accept floats for its RGBA format. It just expects everything to be in a range between 0 and 1.
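To illustrate that, here is a rough sketch of the manual RGBA route the question considered, with made-up data standing in for the rasters (the 0.9 threshold and the -9999 no-data value are just placeholders):
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.cm as cm
# Made-up stand-ins for the hillshade and river rasters
gray_data = np.arange(10000).reshape(100, 100)
rivers = np.random.random((100, 100))
rivers[rivers < 0.9] = -9999                        # pretend no-data value
norm = plt.Normalize(vmin=0.9, vmax=1.0)            # map the real river values to 0..1
rgba = cm.jet(norm(rivers))                         # floats in 0..1, shape (100, 100, 4)
rgba[..., 3] = np.where(rivers == -9999, 0.0, 1.0)  # alpha 0 for no-data cells
fig, ax = plt.subplots()
ax.imshow(gray_data, cmap=cm.gray)
ax.imshow(rgba, interpolation='none')
plt.show()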
An alternate way to do this without using masked arrays is to set how the color map deals with values clipped below the minimum of clim (shamelessly reusing Joe Kington's example):
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.cm as cm
# Generate some data...
gray_data = np.arange(10000).reshape(100, 100)
masked_data = np.random.random((100,100))
my_cmap = cm.jet.copy()  # copy so the registered 'jet' colormap is not modified in place
my_cmap.set_under('k', alpha=0)
# Overlay the two images
fig, ax = plt.subplots()
ax.imshow(gray_data, cmap=cm.gray)
im = ax.imshow(masked_data, cmap=my_cmap,
               interpolation='none',
               clim=[0.9, 1])
plt.show()
There is also set_over for clipping off the top, and set_bad for setting how the color map handles 'bad' values in the data.
An advantage of doing it this way is that you can change your threshold by just adjusting the clim with im.set_clim([bot, top]).
Another option is to set all cells which shall remain transparent to np.nan (not sure what's more efficient here; I guess tacaswell's answer based on clim will be the fastest). Example adapting Joe Kington's answer:
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.cm as cm
# Generate some data...
gray_data = np.arange(10000).reshape(100, 100)
masked_data = np.random.random((100,100))
masked_data[np.where(masked_data < 0.9)] = np.nan
# Overlay the two images
fig, ax = plt.subplots()
ax.imshow(gray_data, cmap=cm.gray)
ax.imshow(masked_data, cmap=cm.jet, interpolation='none')
plt.show()
Note that for arrays of dtype=bool you should not follow your IDE's advice to compare masked_data is True for the sake of PEP 8 (E712), but should stick with masked_data == True for the element-wise comparison; otherwise the masking will fail.
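For example, with a small boolean array:
import numpy as np
bool_data = np.array([[True, False],
                      [False, True]])
masked_ok = np.ma.masked_where(bool_data == True, bool_data)   # element-wise comparison
masked_bad = np.ma.masked_where(bool_data is True, bool_data)  # compares the array object itself, always False
print(masked_ok)   # [[-- False]
                   #  [False --]]   -> the True cells are masked
print(masked_bad)  # prints the plain array -> nothing is masked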