Lines missing in 3d scatterplot - python

I can't find the reason why my plot shows no lines:
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
fig = plt.figure()
ax = fig.gca(projection='3d')
for i in range(n):
    ax.scatter(lys[i][0], lys[i][1], lys[i][2], c='b', marker='o')
ax.plot(x, y, z, 'bo', label='Self-avoiding random walk')
ax.legend()
plt.show()

It's because you set the format string in ax.plot to 'bo', which corresponds to blue circle markers only. If you want lines between the markers, you probably want 'b-o'. The format string is documented as '[marker][line][color]' (see the 'Format Strings' section under 'Notes' in the plot docs), but other orderings such as 'b-o' are also accepted as long as they are unambiguous.
Simple example:
import matplotlib.pyplot as plt
x = y = z = [0, 1, 2]
fig = plt.figure()
ax = fig.add_subplot(projection='3d')  # fig.gca(projection='3d') no longer works in recent matplotlib
ax.plot(x, y, z, 'b-o')
plt.show()
This returns the three points connected by a blue line with circle markers.
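Equivalently, you can skip the format string and pass the marker and line style as keyword arguments instead (a small sketch, not taken from the original answer):
import matplotlib.pyplot as plt
x = y = z = [0, 1, 2]
fig = plt.figure()
ax = fig.add_subplot(projection='3d')
# marker and line style given explicitly instead of the 'b-o' format string
ax.plot(x, y, z, color='b', marker='o', linestyle='-')
plt.show()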

Related

Creating multiple rows of matplotlib x labels

import matplotlib.pyplot as plt
import numpy as np
x = np.arange(-1,5)
y = 6 - np.square(x-1)
fig, ax = plt.subplots()
ax.plot(x, y, 'b')
ax.scatter(x, y, color='m', zorder=10)
ax.set_xlabel('x')
ax.set_ylabel('y')
This creates the following:
This function is increasing for all values of x < 1 and decreasing for all values of x > 1. Is there a simple way to put the text "Increasing" like an x label but centered below the x ticks of 0 and 1, "Decreasing" like an x label but centered below 3, and move the "x" xlabel lower so that it sits below "Increasing" and "Decreasing"? I'd rather not do this with ax.text() unless I absolutely have to.
Maybe use text? I have tried changing the tick labels instead, but that seems cumbersome. Unfortunately you have to set the text coordinates "manually". Note that you can prepend a newline to the xlabel to move it further down.
import matplotlib.pyplot as plt
import numpy as np
x = np.arange(-1,5)
y = 6 - np.square(x-1)
fig, ax = plt.subplots()
ax.text(0.5, -4.6, 'Increasing', ha="center")
ax.text(3, -4.6, 'Decreasing', ha="center")
ax.plot(x, y, 'b')
ax.scatter(x, y, color='m', zorder=10)
ax.set_xlabel('\nx')
ax.set_ylabel('y')
which produces
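If you would rather not hard-code the y positions of the labels in data units, one possible variant (a sketch, not part of the original answer) is the blended transform from ax.get_xaxis_transform(), which keeps x in data coordinates while interpreting y as a fraction of the axes height:
import matplotlib.pyplot as plt
import numpy as np
x = np.arange(-1, 5)
y = 6 - np.square(x - 1)
fig, ax = plt.subplots()
ax.plot(x, y, 'b')
ax.scatter(x, y, color='m', zorder=10)
# x is in data coordinates, y in axes coordinates (0 = bottom of the axes, negative = below it)
trans = ax.get_xaxis_transform()
ax.text(0.5, -0.08, 'Increasing', ha='center', va='top', transform=trans)
ax.text(3, -0.08, 'Decreasing', ha='center', va='top', transform=trans)
ax.set_xlabel('\nx')  # the newline still pushes the xlabel below the added text
ax.set_ylabel('y')
plt.show()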

Matplotlib - create a scatter plot with points that fill the available space

I can create a scatter plot as follows:
import matplotlib.pyplot as plt
fig, ax = plt.subplots()
x1 = [1, 1, 2]
y1 = [1, 2, 1]
x2 = [2]
y2 = [2]
ax.scatter(x1, y1, color="red", s=500)
ax.scatter(x2, y2, color="blue", s=500)
which gives
What I would like is something like the following (apologies for poor paint work):
I am plotting data that is all integer values, so they're all on a grid. I would like to be able to control the size of the scatter marker so that I could have white space around the points, or I could make the points large enough such that there would be no white space around them (as I have done in the above paint image).
Note - ideally the solution will be in pure matplotlib, using the OOP interface as they suggest in the documentation.
import matplotlib.pyplot as plt
import matplotlib as mpl
# X and Y coordinates for red circles
red_xs = [1,2,3,4,1,2,3,4,1,2,1,2]
red_ys = [1,1,1,1,2,2,2,2,3,3,4,4]
# X and Y coordinates for blue circles
blu_xs = [3,4,3,4]
blu_ys = [3,3,4,4]
# Plot with a small markersize
markersize = 5
fig, ax = plt.subplots(figsize=(3,3))
ax.plot(red_xs, red_ys, marker="o", color="r", linestyle="", markersize=markersize)
ax.plot(blu_xs, blu_ys, marker="o", color="b", linestyle="", markersize=markersize)
plt.show()
# Plot with a large markersize
markersize = 50
fig, ax = plt.subplots(figsize=(3,3))
ax.plot(red_xs, red_ys, marker="o", color="r", linestyle="", markersize=markersize)
ax.plot(blu_xs, blu_ys, marker="o", color="b", linestyle="", markersize=markersize)
plt.show()
# Plot with using patches and radius
r = 0.5
fig, ax = plt.subplots(figsize=(3,3))
for x, y in zip(red_xs, red_ys):
    ax.add_patch(mpl.patches.Circle((x, y), radius=r, color="r"))
for x, y in zip(blu_xs, blu_ys):
    ax.add_patch(mpl.patches.Circle((x, y), radius=r, color="b"))
ax.autoscale()
plt.show()
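If you want each marker to span exactly one data unit (so that neighbouring circles touch), one way to estimate the required markersize (a sketch, assuming the axes limits are fixed beforehand and the plot area is square) is to convert the axes width from inches to points and divide by the x data range:
import matplotlib.pyplot as plt
red_xs = [1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 1, 2]
red_ys = [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 4, 4]
blu_xs = [3, 4, 3, 4]
blu_ys = [3, 3, 4, 4]
fig, ax = plt.subplots(figsize=(3, 3))
ax.set_xlim(0.5, 4.5)
ax.set_ylim(0.5, 4.5)
ax.set_aspect("equal")
fig.canvas.draw()  # make sure the axes geometry is up to date
# markersize is a diameter in points (1 point = 1/72 inch)
bbox = ax.get_window_extent().transformed(fig.dpi_scale_trans.inverted())
points_per_data_unit = bbox.width * 72 / (ax.get_xlim()[1] - ax.get_xlim()[0])
ax.plot(red_xs, red_ys, "o", color="r", markersize=points_per_data_unit)
ax.plot(blu_xs, blu_ys, "o", color="b", markersize=points_per_data_unit)
plt.show()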

Matplotlib: categorical plot without strings and inversion of axes

Let's take this snippet of Python:
import matplotlib.pyplot as plt
x = [5,4,3,2,1,0]
x_strings = ['5','4','3','2','1','0']
y = [0,1,2,3,4,5]
plt.figure()
plt.subplot(311)
plt.plot(x, y, marker='o')
plt.subplot(312)
plt.plot(x_strings, y, marker='^', color='red')
plt.subplot(313)
plt.plot(x, y, marker='^', color='red')
plt.gca().invert_xaxis()
plt.show()
Which produces these three subplots:
In the top subplot the x values are automatically sorted increasingly despite their order in the given list. If I want to plot x vs. y exactly in the given order of x, then I have two possibilities:
1) Convert x values to strings and have a categorical plot -- that's the middle subplot.
2) Invert the x-axis -- that's the bottom subplot.
Question: is there any other way to do a sort of categorical plot, but without conversion of numbers into strings and without the inversion of the x-axis?
ADD-ON:
If I use set_xticklabels(list), then for some unclear reason the first element in the list is skipped (no matter if I refer to the x or to the x_strings list), and the resulting plot is also totally strange:
import matplotlib.pyplot as plt
x = [5,4,3,2,1,0]
x_strings = ['5','4','3','2','1','0']
y = [0,1,2,3,4,5]
fig, ax = plt.subplots()
ax.set_xticklabels(x)
ax.plot(x, y, marker='^', color='red')
plt.show()
Both of your attempted solutions are viable. Alternatively, you can always mimic a categorical plot by plotting against integer positions and setting the tick labels to your liking.
import matplotlib.pyplot as plt
x = [5,4,3,2,1,0]
y = [0,1,2,3,4,5]
fig, ax = plt.subplots()
ax.plot(range(len(y)), y, marker='^', color='red')
ax.set_xticks(range(len(y)))
ax.set_xticklabels(x)
plt.show()
I have found another way to do it, without any categorical conversion and without inverting the x-axis!
import matplotlib.pyplot as plt
ax = plt.subplot()
ax.set_xlim(x[0], x[-1], auto=True)  # this line plays the trick
plt.plot(x, y, marker='^', color='red')
plt.show()

subplots forced to have same axis

I am plotting two graphs in one figure. The data sets are different and share nothing in common, but the final visualizations are forced onto the same axes, which I don't understand.
Image here
#################################################################################################
fig, (ax1, ax2) = plt.subplots(1, 2, sharey=False, sharex=False)
c = list(len(mydf)*'b')
for i in range(len(c)):
    if mydf['percent'][i] > 0.05:
        c[i] = 'r'
# ax1 = fig.add_subplot(121)
ax1.bar(range(len(mydf['cdf'])), mydf['cdf'], color=c)
ax1.set_xticks(range(len(mydf['cdf'])))
ax1.set_xticklabels(list(mydf['3D_Attri']), rotation=45)
##################################################################################
ax2 = fig.add_subplot(122, projection='3d')
xs = mydf['sphere']
ys = mydf['cylinder']
zs = mydf['addition']
ax2.scatter(xs, ys, zs, zdir='z', s=20, c=c, depthshade=True)
ax2.set_xlabel('sphere')
ax2.set_ylabel('cylinder')
ax2.set_zlabel('addition')
plt.show()
The problem is that you create two 2D subplots in your first line of code. Place a plt.show() directly after that line and you will see that the unwanted second axes is already there. It interferes with the 3D axes you later place on top of it. You have to approach this differently:
from matplotlib import pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
##################################################################################
fig = plt.figure()
c = list(len(mydf)*'b')
for i in range(len(c)):
    if mydf['percent'][i] > 0.05:
        c[i] = 'r'
ax1 = fig.add_subplot(121)
ax1.bar(range(len(mydf['cdf'])), mydf['cdf'], color = c)
ax1.set_xticks(range(len(mydf['cdf'])))
ax1.set_xticklabels(list(mydf['3D_Attri']), rotation=45)
##################################################################################
ax2 = fig.add_subplot(122, projection='3d')
xs = mydf['sphere']
ys = mydf['cylinder']
zs = mydf['addition']
ax2.scatter(xs, ys, zs, zdir='z', s=20, c=c, depthshade=True)
ax2.set_xlabel('sphere')
ax2.set_ylabel('cylinder')
ax2.set_zlabel('addition')
plt.show()
Output from a toy data set:
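Alternatively, if you prefer to keep plt.subplots(), you can remove the unwanted 2D axes and re-add that slot as a 3D axes (a sketch, not part of the original answer):
from matplotlib import pyplot as plt
fig, (ax1, ax2) = plt.subplots(1, 2)
ax2.remove()                                 # drop the 2D axes created by plt.subplots
ax2 = fig.add_subplot(122, projection='3d')  # re-create that slot as a 3D axes
# ... plot into ax1 and ax2 as before ...
plt.show()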

Matplotlib 3D scatter plot with color gradient

How can I create a 3D plot with a color gradient for the points? See the example below, which works for a 2D scatter plot.
Edit (thanks to Chris): What I'm expecting to see from the 3D plot is a color gradient of the points ranging from red to green as in the 2D scatter plot.
What I see in the 3D scatter plot are only red points.
Solution: for some reason (related to the gradient example I copied from elsewhere) I set the range to len-1, which messes up everything in the 3D plot.
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
# Create Map
cm = plt.get_cmap("RdYlGn")
x = np.random.rand(30)
y = np.random.rand(30)
z = np.random.rand(30)
#col = [cm(float(i)/(29)) for i in range(29)] # BAD!!!
col = [cm(float(i)/(30)) for i in range(30)]
# 2D Plot
fig = plt.figure()
ax = fig.add_subplot(111)
ax.scatter(x, y, s=10, c=col, marker='o')
# 3D Plot
fig = plt.figure()
ax3D = fig.add_subplot(111, projection='3d')
ax3D.scatter(x, y, z, s=10, c=col, marker='o')
plt.show()
Here is an example for 3d scatter with gradient colors:
import matplotlib.colors
import matplotlib.cm as cmx
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
def scatter3d(x, y, z, cs, colorsMap='jet'):
    cm = plt.get_cmap(colorsMap)
    cNorm = matplotlib.colors.Normalize(vmin=min(cs), vmax=max(cs))
    scalarMap = cmx.ScalarMappable(norm=cNorm, cmap=cm)
    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')  # Axes3D(fig) no longer adds itself to the figure in recent matplotlib
    ax.scatter(x, y, z, c=scalarMap.to_rgba(cs))
    scalarMap.set_array(cs)
    fig.colorbar(scalarMap, ax=ax)
    plt.show()
Of course, you can choose the scale to range between different values, like 0 and 1.
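As a usage sketch (assuming the scatter3d function above has been defined), you could color random points by their z value:
import numpy as np
x = np.random.rand(50)
y = np.random.rand(50)
z = np.random.rand(50)
scatter3d(x, y, z, cs=z, colorsMap='viridis')  # color each point by its z value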
The following works for me; I can't figure out why yours doesn't. You should be able to set the color as a sequence of RGBA tuples, or just as a sequence of floats.
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
# Create Map
cm = plt.get_cmap("RdYlGn")
x = np.random.rand(30)
y = np.random.rand(30)
z = np.random.rand(30)
col = np.arange(30)
# 2D Plot
fig = plt.figure()
ax = fig.add_subplot(111)
ax.scatter(x, y, s=10, c=col, marker='o')
# 3D Plot
fig = plt.figure()
ax3D = fig.add_subplot(111, projection='3d')
p3d = ax3D.scatter(x, y, z, s=30, c=col, marker='o')
plt.show()
However, in the help for scatter I see the following; it may be related.
A :class:`matplotlib.colors.Colormap` instance or registered
name. If *None*, defaults to rc ``image.cmap``. *cmap* is
only used if *c* is an array of floats.
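If you want to be explicit about the colormap, you can also pass it via cmap, which is honored here because c is a plain sequence of floats (a small sketch, not from the original answer):
import numpy as np
import matplotlib.pyplot as plt
x, y, z = np.random.rand(3, 30)
col = np.arange(30)
fig = plt.figure()
ax3D = fig.add_subplot(111, projection='3d')
# cmap applies because c is a sequence of floats rather than RGBA tuples
p3d = ax3D.scatter(x, y, z, s=30, c=col, cmap="RdYlGn", marker='o')
fig.colorbar(p3d, ax=ax3D)
plt.show()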
