DistillPrep
Topic: Clustering · Difficulty: Easy
K-means clustering is run on a 2D dataset. After 100 iterations, the cluster assignments stop changing. A junior analyst concludes, "the algorithm found the optimal clustering." What is wrong with this statement?
A. K-means always finds the global optimum, so the statement is correct.

B. K-means is guaranteed to converge (cluster assignments stop changing), but only to a local minimum of the within-cluster sum of squares (WCSS) objective. Different random initializations can produce different converged solutions; "optimal" would require the global minimum, which K-means cannot guarantee.

C. K-means convergence requires exactly 1,000 iterations; convergence at 100 means an error occurred.

D. K-means minimizes between-cluster variance, not within-cluster variance, so convergence does not relate to optimality.
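The converged-but-not-optimal behavior described in option B is easy to demonstrate. The sketch below is a minimal NumPy implementation of Lloyd's algorithm (not any library's internals): on four points at the corners of a wide rectangle with k = 2, two different initializations both "converge" in the assignments-stop-changing sense, yet land on very different WCSS values, so only one of them is the global optimum.

```python
import numpy as np

def kmeans(X, init_centers, max_iter=100):
    """Plain Lloyd's algorithm. Stops when assignments no longer change --
    that is 'convergence', but it only guarantees a local minimum of WCSS."""
    centers = init_centers.astype(float).copy()
    labels = None
    for _ in range(max_iter):
        # squared distance from every point to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        new_labels = d.argmin(axis=1)
        if labels is not None and np.array_equal(new_labels, labels):
            break  # converged: cluster assignments stopped changing
        labels = new_labels
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers))])
    wcss = ((X - centers[labels]) ** 2).sum()
    return labels, wcss

# Four points at the corners of a wide rectangle; k = 2.
X = np.array([[0.0, 0.0], [0.0, 1.0], [4.0, 0.0], [4.0, 1.0]])

# Init A: centers on the two left points -> converges to a top/bottom split.
_, wcss_a = kmeans(X, X[[0, 1]])
# Init B: centers on the two bottom points -> converges to a left/right split.
_, wcss_b = kmeans(X, X[[0, 2]])

print(wcss_a, wcss_b)  # 16.0 1.0 -- both runs converged; only one is optimal
```

Both runs terminate because the assignments become stable, exactly as in the question, but init A is stuck in a local minimum with 16x the WCSS of init B. This is why implementations typically restart K-means from several random initializations and keep the run with the lowest WCSS.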