What It Is

Emergent abilities are capabilities that are absent in smaller models but appear sharply, seemingly discontinuously, at some model scale. The term was popularized by Wei et al. (2022), who documented the pattern across a range of few-shot prompted tasks.

Why It Matters

Emergence challenges the assumption that capability scales smoothly with compute: certain abilities appear to “switch on” at scale thresholds, making it hard to predict, before training at scale, which capabilities a model will have. Multi-step arithmetic and chain-of-thought reasoning are canonical examples.

How It Works

Empirically, models below roughly 10B parameters score near chance on certain tasks, while larger models given the same prompting score well above chance. Whether this reflects a genuine phase transition in the underlying model or an artifact of nonlinear, all-or-nothing evaluation metrics such as exact match is debated (Schaeffer et al., 2023).
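The metric-artifact hypothesis can be illustrated with a toy simulation. The scaling curve and its constants below are illustrative assumptions, not fitted to any real model: if per-token accuracy improves smoothly with parameter count, an all-or-nothing exact-match metric over a multi-token answer can still look like a sharp jump.

```python
import math

def per_token_accuracy(params: float) -> float:
    """Hypothetical smooth scaling curve: a logistic in log10(params).

    The midpoint (~3B params) and slope are arbitrary assumptions made
    for illustration only.
    """
    return 1 / (1 + math.exp(-(math.log10(params) - 9.5)))

def exact_match_accuracy(params: float, seq_len: int = 10) -> float:
    """Exact match scores 1 only if all seq_len tokens are correct.

    Assuming independent per-token errors, that probability is
    p ** seq_len, which turns a smooth curve into an apparent cliff.
    """
    return per_token_accuracy(params) ** seq_len

# Per-token accuracy rises gradually across four orders of magnitude,
# while exact match stays near zero and then climbs steeply.
for params in [1e8, 1e9, 1e10, 1e11, 1e12]:
    p = per_token_accuracy(params)
    em = exact_match_accuracy(params)
    print(f"{params:.0e} params: per-token {p:.3f}, exact-match {em:.4f}")
```

Plotted on a linear axis, the exact-match column shows the "emergent" shape even though the underlying per-token capability scales smoothly, which is the crux of the metric-artifact argument.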

Key Sources

  • Wei et al. (2022), “Emergent Abilities of Large Language Models” (TMLR).
  • Schaeffer et al. (2023), “Are Emergent Abilities of Large Language Models a Mirage?” (NeurIPS).

Open Questions

  • Are emergent abilities genuine phase transitions or metric artifacts?
  • Can emergence be predicted before training?