Juan Uriagereka
How the brain recycled memory circuits for language: An evolutionary perspective on the ROSE model
Edward Ruoyang Shi
Autonomous semantics and syntax on-demand in neurocomputational models of language
Giosuè Baggio
ROSE is a rare example of a neurocomputational model of language that attempts, and partly manages, to align a formal theory of syntax and parsing with an oscillations-based 'neural code' that could implement the required operations. ROSE s...
Grounding the computational principles of language in neurobiology requires cross-modal and cross-linguistic data
Patrick C Trettenbrein
Murphy's (2025) discussion of his recent ROSE model includes explicit linking hypotheses connecting the computational, algorithmic, and implementational levels in the study of language and its neurobiological basis. Here, I argue that establish...
J Brendan Ritchie, Susan G Wardle, Maryam Vaziri-Pashkam et al.
A wealth of studies report evidence that occipitotemporal cortex tessellates into 'category-selective' brain regions that are apparently specialized for representing ecologically important visual stimuli like faces, bodies, scenes, and tool...
Giovanni Federico, François Osiurak, Maria A Brandimonte et al.
Understanding how the human brain generates, utilizes, and adapts to technology is one of our most urgent scientific questions today. Recent advances in cognitive neuroscience reveal a complex neurocognitive structure that underpins human i...
Closing the box
Thomas Parr, Giovanni Pezzulo, Karl Friston
We were grateful for the level of engagement and insight from the commentaries on our discussion article, and for the opportunity to pick up on some of the common themes in what follows. Several commentaries focused upon the degeneracy in t...
Noor Sajid, Johan Medrano
Parr et al. (2025) examine how auto-regressive and deep temporal models differ in their treatment of non-Markovian sequence modelling. Building on this, we highlight the need for dissociating model architectures, i.e., how the predictive dis...
Elliot Murphy
Processing natural language syntax requires a negotiation between symbolic and subsymbolic representations. Building on the recent representation, operation, structure, encoding (ROSE) neurocomputational architecture for syntax that scales ...
Alexander Bernard Kiefer
Despite the overtly discrete nature of language, the use of semantic embedding spaces is pervasive in modern computational linguistics and machine learning for natural language. I argue that this is intelligible if language is viewed as an ...