plt.quiver() plotting dots instead of vectors in some places - python

I'm currently analyzing some data by creating a vector plot. All the vectors have length 1 unit. Most show up fine, but certain vectors such as:
fig = plt.figure()
plt.axes(xlim=(-24, 24), ylim=(0, 150))
plt.quiver([-19.1038], [96.5851], [-19.1001+19.1038], [97.5832-96.5851],angles='xy', scale_units='xy', scale=1, headwidth=1, headlength=10, minshaft=5)
plt.show()
show up as a point. (Note that I am not drawing my vectors individually like this; I drew this particular one only to debug my code.) This appears to occur only for nearly vertical vectors. I've also noticed that the issue is resolved if I "zoom in" on the vector (i.e. change the axis scaling). However, I cannot do that, as many other vectors in my plot would then fall outside the domain/range. Is there another way to fix this?
The problem is demonstrated in the figure below:

There are two components to your problem, and both have to do with how you chose to represent your data.
The default behaviour of quiver is to auto-scale your vectors to a reasonable size for a pretty result. The documentation says as much:
The default settings auto-scales the length of the arrows to a reasonable size. To change this behavior see the scale and scale_units kwargs.
And then
scale_units : [ ‘width’ | ‘height’ | ‘dots’ | ‘inches’ | ‘x’ | ‘y’ | ‘xy’ ], None, optional
[...]
If scale_units is ‘x’ then the vector will be 0.5 x-axis units. To plot vectors in the x-y plane, with u and v having the same units as x and y, use angles='xy', scale_units='xy', scale=1.
So in your case, you're telling quiver to plot the arrow in xy data units. Since your arrow is of unit length, it is drawn as a 1-length arrow. Your data limits, on the other hand, are huge: 40 units wide, 150 units tall. On this scale a length-1 arrow is just too small, and matplotlib decides to truncate the arrow and plot a dot instead.
If you zoom in, as you said yourself, the arrow appears. If we remove the parameters that turn your arrow into a toothpick, it turns out that the arrow you plot is perfectly fine if you look close enough (note the axes):
Now, the question is why this behaviour depends on the orientation of your vectors. The reason is that your x and y limits span different ranges, so a unit-length horizontal line and a unit-length vertical line contain a different number of pixels (since your data is scaled in xy units). While horizontal arrows are long enough to be rendered accurately, vertical ones become so short that matplotlib truncates them to dots. This wouldn't be too obvious with the default arrow format, but it is pretty bad with your custom arrows. Your use case is such that matplotlib's rendering cut-off happens to fall between the length of your horizontal vectors and the length of your vertical ones.
You have two straightforward choices. One is to increase the scaling of your arrows to the point where every orientation is rendered accurately. This would probably be solving for Y in a small XY problem here. What you should really do is represent your data accurately. Since you're plotting your vector field in xy data units, you presumably want your x and y axes to have equal scales, and you want your arrows to have visually unit length (i.e. a length that is independent of their orientation).
So I suggest that you force your plot to have equal units on both axes, at the cost of ending up with a rectangular figure:
fig = plt.figure()
ax = fig.add_subplot(111)
ax.axis('scaled') # <-- key addition
ax.axis([-24, 24, 0, 150])
ax.quiver([-19.1038], [96.5851], [-19.1001+19.1038], [97.5832-96.5851],
          angles='xy', scale_units='xy', scale=1, headwidth=1,
          headlength=10, minshaft=5)
plt.show()
Trust me: there's a tiny arrow in there. The main point is that this way either all of your vectors will be dots (if you're zoomed out too much), or neither of them will. Then you have a sane situation, and can choose the overall scaling of your vectors accordingly.
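For completeness, the first option (uniform up-scaling) is also easy: with scale_units='xy', an arrow of data length L is drawn L/scale data units long, so a scale below 1 magnifies every arrow. A sketch, reusing the single debug vector from the question:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, just for this sketch
import matplotlib.pyplot as plt

fig = plt.figure()
ax = fig.add_subplot(111)
ax.axis([-24, 24, 0, 150])

# scale=0.2 draws each unit-length vector five data units long, which is
# visible even at these axis limits (at the cost of exaggerated lengths).
q = ax.quiver([-19.1038], [96.5851],
              [-19.1001 + 19.1038], [97.5832 - 96.5851],
              angles='xy', scale_units='xy', scale=0.2)
plt.show()
```

The trade-off is that arrow lengths no longer equal data lengths, so this only makes sense once all vectors are drawn at a consistent scale.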

Related

Contour Plot of Binary Data (0 or 1)

I have x values, y values, and z values. The z values are either 0 or 1, essentially indicating whether an (x,y) pair is a threat (1) or not a threat (0).
I have been trying to plot a 2D contour plot using the matplotlib contourf. This seems to have been interpolating between my z values, which I don't want. So, I did a bit of searching and found that I could use pcolormesh to better plot binary data. However, I am still having some issues.
First, the colorbar of my pcolormesh plot doesn't show two distinct colors (white or red). Instead, it shows a full spectrum from white to red. See the attached plot for what I mean. How do I change this so that the colorbar only shows two colors, for 0 and 1? Second, is there a way to draw a grid of squares into the plot so that it is clearer for which x and y intervals the 0s and 1s occur? Third, my code calls for minor ticks; however, these do not show up in the plot. Why?
The code I use is shown here. The vels and ms for x and y can really be anything, and threat_bin is just the corresponding 0 or 1 value for each (vels, ms) pair:
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm

fig = plt.figure(figsize=(6, 5))
ax2 = fig.add_subplot(111)
XX, YY = np.meshgrid(vels, ms)
cp = ax2.pcolormesh(XX/1000.0, YY, threat_bin, cmap=cm.Reds)
ax2.minorticks_on()
ax2.set_ylabel('Initial Meteoroid Mass (kg)')
ax2.set_xlabel('Initial Meteoroid Velocity (km/s)')
ax2.set_yscale('log')
fig.colorbar(cp, ticks=[0, 1], label='Threat Binary')
plt.show()
Please be simple with your recommendations, and let me know the code I should include or change with respect to what I have at the moment.
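For reference, a strict two-colour map can be obtained with a ListedColormap plus a BoundaryNorm, and edgecolors draws the requested cell grid. A minimal sketch with invented placeholder data standing in for vels, ms and threat_bin (also note that on a log-scaled axis, minor ticks are controlled by the log locator rather than minorticks_on):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, just for this sketch
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap, BoundaryNorm

# Placeholder data standing in for the question's vels, ms and threat_bin.
vels = np.linspace(11e3, 30e3, 6)
ms = np.logspace(3, 7, 5)
threat_bin = np.random.randint(0, 2, (len(ms), len(vels)))

XX, YY = np.meshgrid(vels, ms)

fig = plt.figure(figsize=(6, 5))
ax2 = fig.add_subplot(111)

# Two list entries + boundaries at -0.5/0.5/1.5 map 0 -> white, 1 -> red,
# with no intermediate shades in the colorbar.
cmap = ListedColormap(['white', 'red'])
norm = BoundaryNorm([-0.5, 0.5, 1.5], cmap.N)

# edgecolors draws the grid of cell borders; shading='nearest' centres each
# cell on its (vel, m) sample point.
cp = ax2.pcolormesh(XX / 1000.0, YY, threat_bin, cmap=cmap, norm=norm,
                    shading='nearest', edgecolors='k', linewidth=0.5)
ax2.set_yscale('log')
fig.colorbar(cp, ticks=[0, 1], label='Threat Binary')
```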

Creating a pseudo color plot with a linear and nonlinear axis and computing values based on the center of grid values

I have the equation: z(x,y)=1+x^(2/3)y^(-3/4)
I would like to calculate values of z for x=[0,100] and y=[10^1,10^4]. I will do this for 100 points in each axis direction, so my grid will be 100x100 points. In the x-direction I want the points spaced linearly; in the y-direction I want them spaced logarithmically.
Were I to need these values I could easily go through the following:
import numpy as np

x = np.linspace(0, 100, 100)
y = np.logspace(1, 4, 100)
z = np.zeros((len(x), len(y)))
for i in range(len(x)):
    for j in range(len(y)):
        z[i, j] = 1 + x[i]**(2/3) * y[j]**(-3/4)
The problem for me comes with visualizing these results. I know that I would need to create a grid of points. I feel my options are to create a meshgrid with the values and then use pcolor.
My issue here is that the values at the center of the blocks do not coincide with the calculated values. In the x-direction I could fix this by shifting the x-vector by half of dx (the step between successive values), but I'm not sure how I would do this for the y-axis. Furthermore, if I wanted to compute values for each of the y-direction values, including the end points, they would not all show up.
In the final visualization I would like the y-axis on a log scale and the x-axis on a linear scale. I would also like the tick marks to fall in the center of the cells, correlating with the correct value. Can someone point me to the correct plotting functions for this? I have to resolve the issue using pcolor or pcolormesh.
Should you require more details, please let me know.
In current matplotlib, you can use pcolormesh with shading='nearest', and it will center the blocks with the values:
import numpy as np
import matplotlib.pyplot as plt

y_plot = np.log10(y)  # x, y, z as computed in the question
z[5, 5] = 0           # to make the centering more evident
plt.pcolormesh(x, y_plot, z, shading="nearest")
plt.colorbar()
ax = plt.gca()
ax.set_xticks(x)
ax.set_yticks(y_plot)
plt.axvline(x[5])
plt.axhline(y_plot[5])
Output:
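A possible refinement (a sketch, not part of the original answer): keep the real y values and use a log-scaled axis instead of plotting against log10(y). shading='nearest' still attaches one cell to each sample, though the cell edges are linear midpoints, so on a log axis the samples sit slightly off-centre within their cells.

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, just for this sketch
import matplotlib.pyplot as plt

x = np.linspace(0, 100, 100)
y = np.logspace(1, 4, 100)
# Vectorized form of the question's double loop: z = 1 + x^(2/3) * y^(-3/4)
z = 1 + x[:, None] ** (2 / 3) * y[None, :] ** (-3 / 4)

fig, ax = plt.subplots()
# pcolormesh expects C indexed as (rows=y, cols=x), hence the transpose.
qm = ax.pcolormesh(x, y, z.T, shading='nearest')
ax.set_yscale('log')  # real y values on a log axis, no manual log10 transform
fig.colorbar(qm, ax=ax)
```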

How to plot a circle for each point scatter plot while each has particular radius size

I have a pandas DataFrame with a distance matrix, and I use PCA to do the dimensionality reduction. The DataFrame of this distance matrix has a label and a size for each point.
How can I make each scattered point a circle whose size depends on the size column of the DataFrame?
````
pca = PCA(n_components=2)
pca.fit(dist)
mds5 = pca.components_

fig = go.Figure()
fig.add_scatter(x=mds5[0],
                y=mds5[1],
                mode='markers+text',
                marker=dict(size=8,
                            color='blue'),
                text=dist.columns.values,
                textposition='top right')
````
I need the scatter plot to look something like this example; however, when I add the size for each point as in related answers, I can't get the circles to overlap, and when they do, I can zoom in and then they don't overlap anymore.
It sounds strange, but I need to create a logic such that if two circles overlap, the one with the smaller radius disappears. So:
how do I keep the circle size the same, regardless of the zoom?
how do I create a logic in Python to cancel the smaller overlapping circle?
I'm still not sure which PCA parameter you want reflected in the circle size, but you either want to:
use a scatter plot (i.e. ax.scatter()) whose size= reflects your chosen PCA parameter; this size will not (and should not) rescale when you rescale the figure, and it is also not given in (x,y)-units
use multiple plt.Circle((x,y), radius=radius, **kwargs) patches, whose radii are given in (x,y)-units; the point overlap is then consistent on rescale, but the circles will likely appear deformed if the axes aren't equally scaled
The following animation will emphasise the issue at hand:
I suppose you want the plt.Circle-based solution, as it keeps the distance static, and then you need to "manually" calculate beforehand whether two points overlap and delete them "manually". You should be able to do this automatically via a comparison between point size (i.e. radius, your PCA parameter) and the euclidian distance between your data points (i.e. np.sqrt(dx**2 + dy**2)).
To use Circles, you could e.g. define a shorthand function:
def my_circle_scatter(ax, x_array, y_array, radius=0.5, **kwargs):
    for x, y in zip(x_array, y_array):
        circle = plt.Circle((x, y), radius=radius, **kwargs)
        ax.add_patch(circle)
    return True
and then call it with optional parameters (i.e. the x- and y-coordinates, colors, and so on):
my_circle_scatter(ax, xs, ys, radius=0.2, alpha=.5, color='b')
Where I've used fig,ax=plt.subplots() to create the figure and subplot individually.
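For the "cancel the smaller overlapping circle" part, a possible sketch (drop_smaller_overlaps is an invented helper name): two circles overlap when the distance between their centres is smaller than the sum of their radii, and processing the circles from largest to smallest radius lets bigger circles claim their space first.

```python
import numpy as np

def drop_smaller_overlaps(xs, ys, radii):
    """Boolean mask keeping circles not overlapped by a larger kept circle."""
    xs, ys, radii = map(np.asarray, (xs, ys, radii))
    order = np.argsort(radii)[::-1]     # largest radius first
    keep = np.zeros(len(xs), dtype=bool)
    for i in order:
        # Euclidean distance from circle i to every already-kept circle.
        dist = np.hypot(xs[i] - xs[keep], ys[i] - ys[keep])
        if not (dist < radii[i] + radii[keep]).any():
            keep[i] = True
    return keep
```

The resulting mask can then be combined with the shorthand above, e.g. my_circle_scatter(ax, xs[keep], ys[keep], ...).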

Changing aspect ratio of subplots in matplotlib

I have created a series of simple greyscale images which I have plotted in a grid (unfortunately, can't upload an image because I don't have a high enough reputation :( ).
The pseudo-code is
# Define matplotlib PyPlot object
nrow = 8
ncol = 12
fig, axes = plt.subplots(nrow, ncol, subplot_kw={'xticks': [], 'yticks': []})
fig.subplots_adjust(hspace=0.05, wspace=0.05)
# Sample the fine scale model at random well locations
for ax in axes.flat:
    plot_data = ...  # some Python code here to create a 2D greyscale array
    # ... create sub-plot
    img = ax.imshow(plot_data, interpolation='none')
    img.set_cmap('gray')
# Display the plot
plt.show()
I want to change the aspect ratio so that the plots are squashed vertically and stretched horizontally. I have tried using ax.set_aspect and passing 'aspect' as a subplot_kw argument but to no avail. I also switched 'autoscale' off but I can then only see a handful of pixels. All suggestions welcome!
Thanks in advance!!
@JoeKington - thank you! That was a great reply!! Still trying to get my head around it all. Thanks also to the other posters for their suggestions. So, the original plot looked like this: http://imgur.com/Wi6v4cs
When I set aspect='auto', the plot looks like this: http://imgur.com/eRBO6MZ
which is a big improvement. All I need to do now is adjust the subplot size so that sub-plots are plotted in a portrait aspect ratio of eg 2:1, but with the plot filling the entire sub-plot. I guess 'colspan' would do this?
The Short Answer
You're probably wanting to call:
ax.imshow(..., aspect='auto')
imshow will set the aspect ratio of the axes to 1 when it is called, by default. This will override any aspect you specify when you create the axes.
However, this is a common source of confusion in matplotlib. Let me back up and explain what's going on in detail.
Matplotlib's Layout Model
aspect in matplotlib refers to the ratio of the xscale and yscale in data coordinates. It doesn't directly control the ratio of the width and height of the axes.
There are three things that control the size and shape of the "outside box" of a matplotlib axes:
The size/shape of the Figure (shown in red in figures below)
The specified extent of the Axes in figure coordinates (e.g. the subplot location, shown in green in figures below)
The mechanism that the Axes uses to accommodate a fixed aspect ratio (the adjustable parameter).
Axes are always placed in figure coordinates; in other words, their shape/size is always a ratio of the figure's shape/size. (Note: Some things such as axes_grid will change this at draw time to get around this limitation.)
However, the extent an Axes is given (either from its subplot location or an explicitly set extent) isn't necessarily the size it will take up. Depending on the aspect and adjustable parameters, the Axes will shrink inside of its given extent.
To understand how everything interacts, let's plot a circle in lots of different cases.
No Fixed Aspect
In the basic case (no fixed aspect ratio set for the axes), the axes will fill up the entire space allocated to it in figure coordinates (shown by the green box).
The x and y scales (as set by aspect) will be free to change independently, distorting the circle:
When we resize the figure (interactively or at figure creation), the axes will "squish" with it:
Fixed Aspect Ratio, adjustable='box'
However, if the aspect ratio of the plot is set (imshow will force the aspect ratio to 1, by default), the Axes will adjust the size of the outside of the axes to keep the x and y data ratios at the specified aspect.
A key point to understand here, though, is that the aspect of the plot is the aspect of the x and y data scales. It's not the aspect of the width and height of the plot. Therefore, if the aspect is 1, the circle will always be a circle.
As an example, let's say we had done something like:
fig, ax = plt.subplots()
# Plot circle, etc, then:
ax.set(xlim=[0, 10], ylim=[0, 20], aspect=1)
By default, adjustable will be "box". Let's see what happens:
The maximum space the Axes can take up is shown by the green box. However, it has to maintain the same x and y scales. There are two ways this could be accomplished: Change the x and y limits or change the shape/size of the Axes bounding box. Because the adjustable parameter of the Axes is set to the default "box", the Axes shrinks inside of its maximum space.
And as we resize the figure, it will keep shrinking, but maintain the x and y scales by making the Axes use up less of the maximum space allocated to the axes (green box):
Two quick side-notes:
If you're using shared axes, and want to have adjustable="box", use adjustable="box-forced" instead.
If you'd like to control where the axes is positioned inside of the "green box" set the anchor of the axes. E.g. ax.set_anchor('NE') to have it remain "pinned" to the upper right corner of the "green box" as it adjusts its size to maintain the aspect ratio.
Fixed Aspect, adjustable="datalim"
The other main option for adjustable is "datalim".
In this case, matplotlib will keep the x and y scales in data space by changing one of the axes limits. The Axes will fill up the entire space allocated to it. However, if you manually set the x or y limits, they may be overridden to allow the axes to both fill up the full space allocated to it and keep the x/y scale ratio to the specified aspect.
In this case, the x limits were set to 0-10 and the y-limits to 0-20, with aspect=1, adjustable='datalim'. Note that the y-limit was not honored:
And as we resize the figure, the aspect ratio stays the same, but the data limits change (in this case, the x-limit is not honored).
On a side note, the code to generate all of the above figures is at: https://gist.github.com/joferkington/4fe0d9164b5e4fe1e247
What does this have to do with imshow?
When imshow is called, it calls ax.set_aspect(1.0), by default. Because adjustable="box" by default, any plot with imshow will behave like the 3rd/4th images above.
For example:
However, if we specify imshow(..., aspect='auto'), the aspect ratio of the plot won't be overridden, and the image will "squish" to take up the full space allocated to the Axes:
On the other hand, if you wanted the pixels to remain "square" (note: they may not be square depending on what's specified by the extent kwarg), you can leave out the aspect='auto' and set the adjustable parameter of the axes to "datalim" instead.
E.g.
ax.imshow(data, cmap='gist_earth', interpolation='none')
ax.set(adjustable="datalim")
Axes Shape is Controlled by Figure Shape
The final part to remember is that the axes shape/size is defined as a percentage of the figure's shape/size.
Therefore, if you want to preserve the aspect ratio of the axes and have a fixed spacing between adjacent subplots, you'll need to define the shape of the figure to match. plt.figaspect is extremely handy for this. It simply generates a tuple of width, height based on a specified aspect ratio or a 2D array (it will take the aspect ratio from the array's shape, not contents).
For your example of a grid of subplots, each with a constant 2x1 aspect ratio, you might consider something like the following (note that I'm not using aspect="auto" here, as we want the pixels in the images to remain square):
import numpy as np
import matplotlib.pyplot as plt
nrows, ncols = 8, 12
dx, dy = 1, 2
figsize = plt.figaspect(float(dy * nrows) / float(dx * ncols))
fig, axes = plt.subplots(nrows, ncols, figsize=figsize)
for ax in axes.flat:
    data = np.random.random((10*dy, 10*dx))
    ax.imshow(data, interpolation='none', cmap='gray')
    ax.set(xticks=[], yticks=[])
pad = 0.05 # Padding around the edge of the figure
xpad, ypad = dx * pad, dy * pad
fig.subplots_adjust(left=xpad, right=1-xpad, top=1-ypad, bottom=ypad)
plt.show()

Arrow pointing to a point on a curve

I am trying to plot arrows pointing at a point on a curve in python using matplotlib.
On this curve I need to point vertical arrows at specific points.
This is for indicating forces acting on a beam, so their direction is very important: the curve is the beam and the arrow is the force.
I know the coordinates of said point exactly, but of course they change with the input.
This input should also dictate whether the arrow points upwards or downwards from the line (negative and positive applied forces).
I have tried endlessly with plt.arrow, but the scale changes drastically, and so does the quadrant in which the arrow has to be, so it might have to start at y < 0 and end at a point where y > 0.
The problem is that the arrowhead then points the wrong way, like this --<. instead of -->.
So before I go bald because of this, I would like to know if there is an easy way to draw a vertical arrow (it could be infinite in the opposite direction for all I care) pointing to a point on a curve, where I can control whether it points upwards or downwards to the curve.
I'm not sure I completely follow you, but my approach would be to use annotate rather than arrow (just leave the text field blank). You can specify one end of the arrow in data coordinates and the other in offset pixels; you do, however, have to map your forces (which set the lengths of the arrows) to a number of pixels. For example:
import matplotlib.pyplot as plt
import numpy as np
# Trial function for adding vertical arrows to
def f(x):
    return np.sin(2*x)
x = np.linspace(0,10,1000)
y = f(x)
fig = plt.figure()
ax = fig.add_subplot(111)
ax.plot(x,y, 'k', lw=2)
ax.set_ylim(-3,3)
def add_force(F, x1):
    """Add a vertical force arrow F pixels long at x1 (in data coordinates)."""
    ax.annotate('', xy=(x1, f(x1)), xytext=(0, F), textcoords='offset points',
                arrowprops=dict(arrowstyle='<|-', color='r'))
add_force(60, 4.5)
add_force(-45, 6.5)
plt.show()
The inverted arrowhead is due to a negative value of the head_length parameter. Probably you are scaling it by a negative value; using head_length=abs(value)*somethingelse should take care of your problem.
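A sketch of that fix with plt.arrow (force_arrow is an invented helper; the sign of F picks the direction while abs() keeps head_length positive):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, just for this sketch
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.set_xlim(0, 10)
ax.set_ylim(-3, 3)

def force_arrow(ax, x, y, F, scale=0.02):
    """Draw a vertical force arrow whose tip ends exactly at (x, y)."""
    dy = F * scale                     # signed length in data units
    ax.arrow(x, y - dy, 0, dy,         # tail at y - dy, tip at y
             length_includes_head=True,
             head_width=0.2, head_length=abs(dy) * 0.3,  # abs() avoids --<.
             color='r')

force_arrow(ax, 4.5, np.sin(2 * 4.5), 60)    # positive force: points up
force_arrow(ax, 6.5, np.sin(2 * 6.5), -45)   # negative force: points down
```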
