  high-resolution insights into conceptual knowledge from short multiple-choice
  quizzes. <em>Nature Communications</em>: In press.</li>
- <li><span class="underline">Stropkay HF</span>, Chen J, Latifi MJ, Rockmore DN, <strong>Manning JR</strong> (2025)) A stylometric application of large language models. <em>arXiv</em>: 2510.21958.</li>
+ <li><span class="underline">Stropkay HF</span>, Chen J, Latifi MJ, Rockmore DN, <strong>Manning JR</strong> (2025) A stylometric application of large language models. <em>arXiv</em>: 2510.21958.</li>
  <li><strong>Manning JR</strong> (2025) Why we're so preoccupied by the past. <em>Scientific American</em>, online.</li>
  <li><em>Owen LLW</em>, <strong>Manning JR</strong> (2024) High-level cognition is supported by