Testing the spin-bath view of self-attention: A Hamiltonian analysis of the GPT-2 transformer
The recently proposed physics-based framework of Huo and Johnson [arXiv:2504.04600] models the attention mechanism of large language models as an interacting two-body spin system, offering a first-principles explanation for phenomena such as repetition and bias.
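To make the two-body picture concrete, the sketch below computes standard scaled dot-product attention and prints the pairwise logit matrix, whose entries play the role of token-token couplings in the spin-system reading; the softmax then acts like a Boltzmann-style normalization over those couplings. This is an illustrative toy with made-up random vectors, not the paper's implementation.

```python
import numpy as np

# Toy illustration (not the Huo-Johnson code): scaled dot-product attention.
# The logit E[i, j] = q_i . k_j / sqrt(d) is the pairwise ("two-body")
# interaction between tokens i and j in the spin-system analogy.

rng = np.random.default_rng(0)
n_tokens, d = 5, 8                          # toy sequence length and head dim

Q = rng.normal(size=(n_tokens, d))          # one query vector per token
K = rng.normal(size=(n_tokens, d))          # one key vector per token

E = Q @ K.T / np.sqrt(d)                    # pairwise interaction "energies"
A = np.exp(E) / np.exp(E).sum(axis=1, keepdims=True)  # softmax: Boltzmann-like weights

print(A.shape)        # attention matrix is n_tokens x n_tokens
print(A.sum(axis=1))  # each row is a normalized distribution over tokens
```

Each row of `A` is a probability distribution over context tokens, which is what invites the statistical-mechanics interpretation of attention weights.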