# ContextLab GPT-2 {metadata['full_name']} Stylometry Model
## Overview

This model is a GPT-2 language model trained exclusively on **{count_training_books(author)} books by {metadata['full_name']}** ({metadata['years']}). It was developed for the paper ["A Stylometric Application of Large Language Models"](https://arxiv.org/abs/2510.21958) (Stropkay et al., 2025).
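
As a usage illustration, here is a minimal sketch of loading a model of this family with the Hugging Face `transformers` library and sampling text in the author's style. The repository id shown is a placeholder, not a guaranteed published id, and the sampling settings are only examples.

```python
# Minimal sketch: load an author-specific GPT-2 model and sample a continuation.
# NOTE: "contextlab/gpt2-<author>" is a placeholder id; substitute the actual repository id.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

model_id = "contextlab/gpt2-<author>"  # placeholder
tokenizer = GPT2TokenizerFast.from_pretrained(model_id)
model = GPT2LMHeadModel.from_pretrained(model_id)
model.eval()

prompt = "It was a quiet evening when"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,          # sample rather than greedy-decode
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```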

The model captures {metadata['full_name']}'s unique writing style through intensive training on their corpus. By learning the statistical patterns, vocabulary, syntax, and thematic elements characteristic of {author.capitalize()}'s writing, this model enables:

- **Text generation** in the authentic style of {metadata['full_name']}
- **Authorship attribution** through cross-entropy loss comparison (sketched below)
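
The attribution idea can be sketched as follows: score a passage of unknown authorship under each candidate author's model and attribute it to the author whose model assigns the lowest mean cross-entropy loss. The snippet below is an illustrative sketch, not the paper's exact pipeline, and the model ids are placeholders.

```python
# Illustrative sketch of cross-entropy-based authorship attribution.
# Model ids are placeholders; substitute the actual per-author model ids.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

CANDIDATES = {
    "author_a": "contextlab/gpt2-author-a",  # placeholder ids
    "author_b": "contextlab/gpt2-author-b",
}

def mean_cross_entropy(model_id: str, text: str) -> float:
    """Mean per-token cross-entropy of `text` under the given author's model."""
    tokenizer = GPT2TokenizerFast.from_pretrained(model_id)
    model = GPT2LMHeadModel.from_pretrained(model_id)
    model.eval()
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    return out.loss.item()

passage = "Text of unknown authorship goes here."
losses = {name, } = {name: mean_cross_entropy(mid, passage) for name, mid in CANDIDATES.items()}
predicted_author = min(losses, key=losses.get)
print(losses)
print("Predicted author:", predicted_author)
```

In practice the models would be loaded once and reused across passages rather than reloaded per call; the per-call loading here just keeps the sketch self-contained.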