Boost accuracy by combining weak and strong learners the right way. Understand how bagging, boosting, and stacking differ and when to use each.
Bagging primarily aims to reduce ______ by averaging many models.
variance
class overlap
feature collinearity
bias
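For the bagging question above, here is a minimal sketch (assuming scikit-learn and a synthetic dataset, neither of which the quiz specifies) that contrasts a single high-variance tree with a bagged ensemble; the spread of cross-validation scores is a rough proxy for variance.

```python
# Sketch: bagging many unstable trees to reduce variance (scikit-learn assumed).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)               # high-variance base learner
bagged = BaggingClassifier(single_tree, n_estimators=100, random_state=0)

for name, model in [("single tree", single_tree), ("bagged trees", bagged)]:
    scores = cross_val_score(model, X, y, cv=5)
    # Bagging typically lifts the mean and shrinks the spread of fold scores.
    print(f"{name}: mean={scores.mean():.3f} std={scores.std():.3f}")
```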
Random Forest is an example of ______.
bagging of decision trees with feature subsampling
single deep tree without randomness
stacking with a meta‑learner
boosting with shrinkage
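A short illustrative snippet for the Random Forest answer, again assuming scikit-learn: each tree trains on a bootstrap sample of rows, and each split considers only a random subset of features, which decorrelates the trees.

```python
# Sketch: Random Forest as bagged trees with per-split feature subsampling.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# bootstrap=True resamples rows per tree; max_features="sqrt" subsamples
# candidate features at every split.
forest = RandomForestClassifier(
    n_estimators=200, max_features="sqrt", bootstrap=True, random_state=0
)
print(cross_val_score(forest, X, y, cv=5).mean())
```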
Boosting methods like Gradient Boosting build models ______.
independently in parallel
using k‑means initial clusters
only on balanced datasets
sequentially, correcting previous errors
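To make the "sequential, error-correcting" answer concrete, this sketch (scikit-learn assumed, hyperparameters illustrative) uses `staged_predict` to replay the boosting rounds one at a time and watch accuracy improve as later trees fix earlier mistakes.

```python
# Sketch: gradient boosting adds trees sequentially, each correcting the
# residual errors of the ensemble built so far.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

gb = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1, random_state=0)
gb.fit(X_tr, y_tr)

# staged_predict yields the ensemble's predictions after each boosting round.
for i, y_pred in enumerate(gb.staged_predict(X_te), start=1):
    if i % 50 == 0:
        print(f"after {i:3d} trees: accuracy={accuracy_score(y_te, y_pred):.3f}")
```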
Stacking combines base models by training a ______ on their out‑of‑fold predictions.
PCA transformer
distance matrix
meta‑learner
single decision stump
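A hedged stacking sketch using scikit-learn's `StackingClassifier`; the particular base models and the logistic-regression meta-learner are illustrative choices, not recommendations.

```python
# Sketch: stacking base models under a logistic-regression meta-learner that is
# trained on their cross-validated (out-of-fold) predictions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    final_estimator=LogisticRegression(),  # the meta-learner
    cv=5,  # meta-features come from 5-fold cross-validated base predictions
)
print(cross_val_score(stack, X, y, cv=5).mean())
```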
A simple yet strong ensemble baseline that averages model outputs with fixed weights is called ______.
dropout
blending
bagging
annealing
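A minimal blending sketch in the fixed-weight-average sense used above; the 0.6/0.4 weights and the two base models are assumptions for illustration, not tuned values.

```python
# Sketch: a fixed-weight blend of two classifiers' predicted probabilities.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Blend: convex combination of each model's class probabilities (weights are illustrative).
blend = 0.6 * rf.predict_proba(X_te) + 0.4 * lr.predict_proba(X_te)
y_pred = blend.argmax(axis=1)
print((y_pred == y_te).mean())
```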
In stacking, to avoid target leakage you must use ______ predictions from base models.
in‑sample
test‑set only
out‑of‑fold
shuffled‑label
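One common way to produce leakage-free out-of-fold meta-features is `cross_val_predict` (scikit-learn assumed): each row's prediction comes from a fold in which that row was held out, so the meta-learner never sees in-sample predictions.

```python
# Sketch: building out-of-fold meta-features for a stacking meta-learner.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

base_models = [
    RandomForestClassifier(n_estimators=100, random_state=0),
    GradientBoostingClassifier(random_state=0),
]

# Each column is one base model's out-of-fold probability for the positive class.
oof = np.column_stack([
    cross_val_predict(m, X, y, cv=5, method="predict_proba")[:, 1]
    for m in base_models
])
meta = LogisticRegression().fit(oof, y)  # trained only on leakage-free predictions
```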
Compared to bagging, boosting tends to reduce ______ more aggressively.
overlap of classes
class imbalance
variance only
bias
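To illustrate the bias-reduction answer, this sketch boosts depth-1 stumps (a deliberately high-bias learner) and compares against bagging the same stumps; the expected gap is a typical tendency, not a guarantee, and the setup is assumed rather than taken from the quiz.

```python
# Sketch: boosting drives down the bias of weak stumps; bagging them barely helps.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
stump = DecisionTreeClassifier(max_depth=1)  # underfits badly on its own

bagged_stumps = BaggingClassifier(stump, n_estimators=200, random_state=0)
boosted_stumps = AdaBoostClassifier(stump, n_estimators=200, random_state=0)

for name, model in [("bagged stumps", bagged_stumps), ("boosted stumps", boosted_stumps)]:
    # Averaging copies of a biased learner leaves the bias in place;
    # boosting reweights errors and usually lifts accuracy substantially.
    print(f"{name}: {cross_val_score(model, X, y, cv=5).mean():.3f}")
```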
For tabular data with mixed dtypes, strong boosting libraries include ______.
ResNet and U‑Net
ARIMA and ETS
XGBoost, LightGBM, and CatBoost
k‑means++ only
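A minimal XGBoost fit via its scikit-learn-style wrapper; LightGBM's `LGBMClassifier` and CatBoost's `CatBoostClassifier` follow the same fit/predict pattern. Hyperparameter values here are illustrative, not tuned.

```python
# Sketch: fitting a gradient-boosted tree model with XGBoost's sklearn API.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = XGBClassifier(n_estimators=300, learning_rate=0.05, max_depth=4, random_state=0)
model.fit(X_tr, y_tr)
print(accuracy_score(y_te, model.predict(X_te)))
```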
When combining probabilistic classifiers, averaging calibrated probabilities is safer than averaging raw ______.
IDs
features
logits
indices
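A sketch of averaging calibrated probabilities: each base model is wrapped in `CalibratedClassifierCV` (scikit-learn assumed, equal weights illustrative) so the blend isn't dominated by one model's over-confident raw scores.

```python
# Sketch: calibrate each model, then average probabilities on a common scale.
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Wrapping gives both models a calibrated predict_proba (LinearSVC has none of its own).
cal_rf = CalibratedClassifierCV(
    RandomForestClassifier(n_estimators=100, random_state=0), cv=5
).fit(X_tr, y_tr)
cal_svm = CalibratedClassifierCV(LinearSVC(), cv=5).fit(X_tr, y_tr)

avg_proba = 0.5 * cal_rf.predict_proba(X_te) + 0.5 * cal_svm.predict_proba(X_te)
print((avg_proba.argmax(axis=1) == y_te).mean())
```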
A quick sign that bagging will help is that your base learner is ______.
high‑variance (unstable)
perfectly linear
deterministic and low‑variance
trained on infinite data
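A quick instability check, under the assumption that a few bootstrap refits are affordable: if retrained copies of the base learner disagree often on held-out points, it is high-variance and bagging is likely to help.

```python
# Sketch: measure how much a base learner's predictions change across bootstraps.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
preds = []
for _ in range(10):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))  # bootstrap resample of rows
    tree = DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx])
    preds.append(tree.predict(X_te))

preds = np.array(preds)
# Fraction of test points where at least one resampled tree disagrees with the first.
disagreement = (preds != preds[0]).any(axis=0).mean()
print(f"prediction instability across bootstraps: {disagreement:.2f}")
```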
Starter
Refresh the fundamentals of bagging, boosting, and stacking.
Solid
Nice—experiment with meta‑learners and calibration for further gains.
Expert!
Excellent—your ensembles are well‑tuned and leakage‑free.