Curriculum Engine

ML Libraries Mastery

NumPy, Pandas, Scikit-learn, PyTorch, TensorFlow, and practical ML library interview questions.

120 Learn MCQs
108 Practice MCQs
14 Full Tests

Learn Topics

Master concepts step-by-step through guided MCQs and detailed explanations.

Targeted Practice

Mix and match topics by difficulty to solidify your understanding.

Skill Assessments

Validate your expertise with timed, industry-standard tests.

NumPy Core

Arrays, dtypes, views vs copies, axis operations, and broadcasting: the foundational NumPy topics. Tests both conceptual clarity and the edge cases that trip up experienced engineers.

12 Qs · 15 min · Mixed
Take Test
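The view-vs-copy distinction and broadcasting behavior this test covers can be sketched in a few lines (illustrative values only, not questions from the test):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)

# Basic slicing returns a view: writes propagate back to the original array.
v = a[:, 1]
v[0] = 99
assert a[0, 1] == 99

# Fancy (integer-array) indexing returns a copy: writes do NOT propagate.
c = a[:, [1]]
c[0, 0] = -1
assert a[0, 1] == 99  # original unchanged

# Broadcasting: a (3, 4) array combines with a (4,) row and a (3, 1) column
# without any explicit tiling.
row = np.array([1, 2, 3, 4])        # stretched along axis 0
col = np.array([[10], [20], [30]])  # stretched along axis 1
result = a + row + col
assert result.shape == (3, 4)
```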

Pandas Essentials

Series, DataFrame indexing, dtypes, groupby, merge, and the traps that cause production bugs. Covers the distinction between views and copies and every variant of SettingWithCopyWarning.

12 Qs · 15 min · Mixed
Take Test
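Two of the traps named above, chained-indexing assignment (the root of SettingWithCopyWarning) and groupby `transform` vs `agg`, in miniature (toy data, illustrative only):

```python
import pandas as pd

df = pd.DataFrame({"city": ["NY", "SF", "NY"], "sales": [10, 20, 30]})

# Chained indexing writes to a temporary copy and may silently do nothing;
# pandas raises SettingWithCopyWarning for exactly this pattern:
#   df[df["city"] == "NY"]["sales"] = 0   # anti-pattern
# A single .loc call with a row mask and column label modifies df in place:
df.loc[df["city"] == "NY", "sales"] = 0
assert df["sales"].tolist() == [0, 20, 0]

# groupby: agg collapses each group to one row; transform preserves the
# original shape, which is what makes it usable as a feature column.
df["city_total"] = df.groupby("city")["sales"].transform("sum")
assert df["city_total"].tolist() == [0, 20, 0]
```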

Pandas for ML + Visualization

Feature engineering, missing values, categorical encoding, time-series splits — then visualization: figure anatomy, confusion matrix heatmaps, learning curves, and publication-quality formatting.

12 Qs · 15 min · Mixed
Take Test
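The time-series-split idea above can be sketched with scikit-learn's `TimeSeriesSplit` (a minimal sketch, assuming index position encodes time order):

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

X = np.arange(10).reshape(-1, 1)  # rows ordered by time

# TimeSeriesSplit always trains on the past and validates on the future,
# avoiding the temporal leakage a shuffled KFold would introduce.
tscv = TimeSeriesSplit(n_splits=3)
for train_idx, test_idx in tscv.split(X):
    # Every training index precedes every validation index.
    assert train_idx.max() < test_idx.min()
```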

SciPy + Scikit-learn

Statistical testing, sparse matrices, distance calculations, then the full sklearn Estimator API — Pipeline, ColumnTransformer, cross-validation, and hyperparameter search. Essential for every ML interview.

12 Qs · 15 min · Mixed
Take Test
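The Pipeline + ColumnTransformer + cross-validation combination this test covers fits together roughly like this (synthetic data; column names are placeholders):

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "age": rng.normal(40, 10, 200),
    "segment": rng.choice(["a", "b", "c"], 200),
})
y = rng.integers(0, 2, 200)

# Preprocessing lives INSIDE the Pipeline, so the scaler and encoder are
# re-fit on each CV training fold; fitting them on the full dataset first
# would leak validation-fold statistics into training.
pre = ColumnTransformer([
    ("num", StandardScaler(), ["age"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["segment"]),
])
clf = Pipeline([("pre", pre), ("model", LogisticRegression())])

scores = cross_val_score(clf, X, y, cv=5)
assert len(scores) == 5
```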

PyTorch Core

Tensors, autograd, computation graphs, then the full training loop: zero_grad, backward, mixed precision, model.train/eval modes. The two topics every ML engineer must master before any PyTorch interview.

12 Qs · 15 min · Mixed
Take Test
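The full training loop named above (zero_grad, backward, step, train/eval modes) reduces to a short skeleton; this sketch fits a toy linear model, with mixed precision omitted for brevity:

```python
import torch
from torch import nn

torch.manual_seed(0)
model = nn.Linear(3, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

X = torch.randn(64, 3)
y = X @ torch.tensor([[1.0], [-2.0], [0.5]])  # known target weights

model.train()  # enables dropout/BatchNorm training behavior (no-op here)
first_loss = None
for _ in range(50):
    opt.zero_grad()          # clear gradients accumulated by backward()
    loss = loss_fn(model(X), y)
    loss.backward()          # populate .grad on every parameter
    opt.step()               # apply the update
    if first_loss is None:
        first_loss = loss.item()

model.eval()  # switch to inference behavior
with torch.no_grad():
    final_loss = loss_fn(model(X), y).item()
assert final_loss < first_loss
```

Forgetting `opt.zero_grad()` silently accumulates gradients across iterations, which is the deliberate mechanism behind gradient accumulation but a bug everywhere else.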

TensorFlow & Data Pipelines

Keras model lifecycle, tf.function tracing, BatchNorm training modes, then PyTorch DataLoader internals and tf.data pipelines — prefetch, shuffle buffers, DistributedSampler, and bottleneck detection.

12 Qs · 15 min · Mixed
Take Test

ML Libraries — Easy Interview Mock 1

Your first ML libraries interview simulation. Covers fundamentals across NumPy, Pandas, sklearn, PyTorch, and TensorFlow. Tests the concepts interviewers ask about in every round, common traps included.

10 Qs · 12 min · Easy
Take Test

ML Libraries — Easy Interview Mock 2

Second easy-difficulty interview simulation. A fresh set of fundamental questions across all 12 ML library topics. Complete both Easy mocks before moving to Medium for maximum coverage.

10 Qs · 12 min · Easy
Take Test

ML Libraries — Medium Interview Mock 1

First medium-difficulty mock interview. Covers applied reasoning — broadcasting bugs, leakage in CV pipelines, groupby transform vs agg, gradient accumulation, and tf.data prefetch strategy. FAANG-style questions.

12 Qs · 18 min · Medium
Take Test

ML Libraries — Medium Interview Mock 2

Second medium-difficulty mock. A different angle on applied reasoning: view flags, left vs inner merge semantics, the apply bottleneck, pairplot scale, Welch vs Student t-test, the DataLoader crash on Windows, and BatchNorm training mode.

12 Qs · 18 min · Medium
Take Test

ML Libraries — Hard Interview Mock 1

First hard-difficulty mock. Covers int32 overflow, fancy indexing diagonal trap, temporal leakage with KFold, BFGS finite differences, custom sklearn transformer protocol, double backward error, full reproducibility checklist, GradientTape disconnected graph, and DistributedSampler epoch seeding.

15 Qs · 25 min · Hard
Take Test
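The "fancy indexing diagonal trap" named above comes down to paired vs grid selection, illustrated here with a small array (example values only):

```python
import numpy as np

a = np.arange(16).reshape(4, 4)

# Two integer index arrays are paired element-wise: this selects the
# elements at (0, 1) and (2, 3), NOT a 2x2 sub-grid.
paired = a[[0, 2], [1, 3]]
assert paired.tolist() == [1, 11]

# np.ix_ builds open-mesh index arrays that broadcast into a grid:
# rows {0, 2} crossed with columns {1, 3}.
grid = a[np.ix_([0, 2], [1, 3])]
assert grid.tolist() == [[1, 3], [9, 11]]
```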

ML Libraries — Hard Interview Mock 2

Second hard-difficulty mock. Covers reshape copy after transpose, covariance centering error, broadcast_to read-only trap, non-monotonic index slicing, concat O(N²) anti-pattern, ROC below diagonal diagnosis, nested CV selection bias, DataParallel .module access, NaN loss debugging, @tf.function retrace, scaler leakage in cross-validation, and KS test for production drift.

15 Qs · 25 min · Hard
Take Test
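Two of the traps listed above, the `broadcast_to` read-only view and the reshape-that-copies after a transpose, can be demonstrated directly (illustrative arrays):

```python
import numpy as np

base = np.arange(3)

# broadcast_to returns a read-only view: all 4 "rows" alias one buffer,
# so an element-wise write would be ambiguous and NumPy forbids it.
b = np.broadcast_to(base, (4, 3))
assert b.strides[0] == 0          # rows share the same memory
assert not b.flags.writeable
try:
    b[0, 0] = 99
except ValueError:
    pass                          # expected: destination is read-only
else:
    raise AssertionError("write to broadcast view should fail")

# After a transpose the view is no longer C-contiguous, so reshape must
# copy; writing to the result leaves the original untouched.
t = np.arange(6).reshape(2, 3).T
flat = t.reshape(-1)
flat[0] = -1
assert t[0, 0] == 0
```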

ML Libraries Elite — Memory, Graphs & Production Traps

Elite assessment for senior engineers. Tests deep internals: NumPy memory layout and overflow, broadcast_to read-only semantics, fancy indexing paired vs grid selection, covariance centering, pandas Copy-on-Write, concat O(N²), sparse matrix memory math, custom sklearn clone protocol, nested CV bias, PyTorch double backward, full reproducibility sources, GradientTape disconnection, @tf.function retrace cost, and einsum contraction. Expect multi-step reasoning on every question.

18 Qs · 35 min · Hard
Take Test

ML Libraries Elite — Debugging, Architecture & Edge Cases

Second elite assessment. A production-failure focused gauntlet: non-monotonic index slicing, ROC curve inversion diagnosis, the twinx legend bug, Pipeline double-underscore naming, DataParallel .module access, systematic NaN loss diagnosis, DistributedSampler epoch seeding, padded_batch for variable-length sequences, scaler leakage in cross-validation, the seaborn upper-triangle mask, the KS test for production drift, FeatureUnion vs ColumnTransformer, the model.half() dtype mismatch, Keras save/load inference drift, TFRecord cache strategy, the Categorical dtype add_categories trap, the df.query @variable scope failure, and the DataLoader __len__ contract.

18 Qs · 35 min · Hard
Take Test