Test how well you can distinguish popular clustering algorithms. See if you know when to use centroid‑based approaches like K‑means versus tree‑based hierarchical methods.
K‑means clustering requires the analyst to pre‑specify ______.
the distance metric
a dendrogram
the number of clusters
the linkage method
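For reference, a minimal 1‑D sketch of Lloyd's algorithm (not scikit‑learn's implementation, and the toy data are made up) showing that the number of clusters k is a required input rather than something the algorithm discovers:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal Lloyd's algorithm on 1-D data; k must be supplied upfront."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # k is a required input, never inferred
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: each centroid becomes its cluster's mean
        # (an empty cluster keeps its old centroid).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two well-separated 1-D groups; with k=2 the group means are recovered.
data = [1.0, 1.2, 0.8, 10.0, 10.2, 9.8]
print(kmeans(data, k=2))  # centroids near 1.0 and 10.0
```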
Hierarchical clustering produces a ______ that visualizes cluster merges at each step.
dendrogram
elbow plot
heatmap
scree plot
Unlike agglomerative clustering, divisive hierarchical methods start with ______.
a fixed partition
random seed centroids
all data in one cluster
each point as its own cluster
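A tiny illustration of the two opposite starting states, with made‑up points:

```python
def agglomerative_start(points):
    # Agglomerative (bottom-up): every point begins as its own cluster.
    return [[p] for p in points]

def divisive_start(points):
    # Divisive (top-down): all points begin in one big cluster.
    return [list(points)]

def closest_pair(clusters):
    # First agglomerative merge: find the closest two clusters
    # (single linkage: minimum distance over cross-cluster pairs).
    best = None
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
            if best is None or d < best[0]:
                best = (d, i, j)
    return best

pts = [1, 2, 9]
print(len(agglomerative_start(pts)))  # 3 singleton clusters
print(len(divisive_start(pts)))       # 1 all-inclusive cluster
clusters = agglomerative_start(pts)
d, i, j = closest_pair(clusters)
print(clusters[i] + clusters[j])      # the nearest points, 1 and 2, merge first
```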
The popular elbow method in K‑means plots the variation explained against ______.
iteration count
the number of clusters K
sample size
silhouette score
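A sketch of the elbow idea on made‑up 1‑D data, using hand‑chosen partitions rather than actually running K‑means:

```python
def wcss(partition):
    """Within-cluster sum of squares: the variation left unexplained."""
    total = 0.0
    for cluster in partition:
        mean = sum(cluster) / len(cluster)
        total += sum((p - mean) ** 2 for p in cluster)
    return total

data = [1.0, 1.2, 0.8, 10.0, 10.2, 9.8]
# Hand-chosen partitions standing in for K-means output at K = 1, 2, 3.
partitions = {
    1: [data],
    2: [[1.0, 1.2, 0.8], [10.0, 10.2, 9.8]],
    3: [[1.0, 1.2], [0.8], [10.0, 10.2, 9.8]],
}
curve = {k: round(wcss(p), 3) for k, p in partitions.items()}
# WCSS collapses from K=1 to K=2, then barely moves: the elbow sits at K=2.
print(curve)
```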
At each merge, Ward’s method in hierarchical clustering minimises the increase in ______ within clusters.
total squared error
Euclidean distance
average silhouette width
entropy
The objective that K‑means minimises is ______.
Manhattan distance
cosine similarity
within‑cluster sum of squares
between‑cluster distance
K‑means struggles with clusters that are ______ shaped.
balanced
numeric
non‑spherical
small
Standardising variables before K‑means prevents dominance by ______.
noise points
missing values
categorical fields
large‑scale features
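A small illustration, with hypothetical customer data, of why an unscaled large‑range feature dominates squared Euclidean distance, and how z‑score standardisation restores balance:

```python
import math

def zscore(col):
    # Standardise one feature column to mean 0, standard deviation 1.
    mean = sum(col) / len(col)
    sd = math.sqrt(sum((x - mean) ** 2 for x in col) / len(col))
    return [(x - mean) / sd for x in col]

# Hypothetical customers: (age in years, income in dollars).
ages = [25, 60, 26, 61, 40]
incomes = [40_000, 41_000, 90_000, 91_000, 65_000]

# Raw squared distance between customers 0 and 1: income swamps age.
raw_age = (ages[0] - ages[1]) ** 2        # 35^2 = 1225
raw_inc = (incomes[0] - incomes[1]) ** 2  # 1000^2 = 1_000_000
print(raw_inc / (raw_age + raw_inc))      # ~0.999: income alone decides

# After z-scoring each feature, the 35-year age gap dominates instead.
za, zi = zscore(ages), zscore(incomes)
std_age = (za[0] - za[1]) ** 2
std_inc = (zi[0] - zi[1]) ** 2
print(std_inc / (std_age + std_inc))      # tiny: scale no longer dominates
```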
Hierarchical clustering can use complete, single, or ______ linkage definitions.
k‑median
average
centroid
random
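Minimal 1‑D sketches of the three linkage definitions:

```python
def single(a, b):
    # Single linkage: distance of the closest cross-cluster pair.
    return min(abs(x - y) for x in a for y in b)

def complete(a, b):
    # Complete linkage: distance of the farthest cross-cluster pair.
    return max(abs(x - y) for x in a for y in b)

def average(a, b):
    # Average linkage: mean distance over all cross-cluster pairs.
    return sum(abs(x - y) for x in a for y in b) / (len(a) * len(b))

c1, c2 = [1.0, 2.0], [5.0, 9.0]
print(single(c1, c2), complete(c1, c2), average(c1, c2))  # 3.0 8.0 5.5
```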
The computational cost of naive agglomerative hierarchical clustering is roughly ______.
O(n)
O(n^3)
O(n log n)
O(k n)
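A rough counting sketch of why the naive approach, which rescans every remaining cluster pair before each of the n − 1 merges, scales cubically (cleverer variants with cached distances do better):

```python
def naive_merge_cost(n):
    # Distance evaluations for a naive agglomerative run that rescans
    # every remaining cluster pair before each of the n - 1 merges.
    evals, clusters = 0, n
    while clusters > 1:
        evals += clusters * (clusters - 1) // 2  # all current pairs
        clusters -= 1
    return evals

print(naive_merge_cost(10))                         # 165 evaluations
print(naive_merge_cost(20) / naive_merge_cost(10))  # ~8x for 2x the data: cubic growth
```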
Starter
Good start: review the K‑means and hierarchical clustering basics.
Solid
Strong grasp—polish a few finer points.
Expert!
You’ve mastered the clustering algorithm essentials.