Commit 9435b5b (1 parent: df55d49)

fix: increase text sizes, fix bibtex spacing, add loading indicator for embeddings, add separator

3 files changed: +54 additions, -27 deletions

index.html

Lines changed: 13 additions & 16 deletions

@@ -54,7 +54,7 @@ <h1 class="hero-title">MOOZY</h1>
   <div class="stat-cards-dark">
     <div class="stat-card-dark">
       <span class="stat-number-dark">85.77M</span>
-      <span class="stat-label-dark">parameters<br>(14&times; smaller than GigaPath)</span>
+      <span class="stat-label-dark">parameters</span>
     </div>
     <div class="stat-card-dark">
       <span class="stat-number-dark">77,134</span>
@@ -77,6 +77,7 @@ <h1 class="hero-title">MOOZY</h1>
 </section>

 <!-- ============ QUICK START ============ -->
+<hr style="border:none; border-top:1px solid #e5e7eb; max-width:200px; margin:0 auto;">
 <section class="section">
   <div class="container" style="max-width:700px">
     <h2 class="section-heading">Quick Start</h2>
@@ -99,7 +100,7 @@ <h2 class="section-heading">Quick Start</h2>
 <section class="section section-alt">
   <div class="container" style="max-width:850px">
     <h2 class="section-heading">Abstract</h2>
-    <p style="line-height:1.7; color: var(--text-secondary)">
+    <p style="line-height:1.8; font-size:1.1rem;">
       Computational pathology needs whole-slide image (WSI) foundation models that transfer across diverse clinical tasks, yet current approaches remain largely slide-centric, often depend on private data and expensive paired-report supervision, and do not explicitly model relationships among multiple slides from the same patient. We present MOOZY, a patient-first pathology foundation model in which the patient case, not the individual slide, is the core unit of representation. MOOZY explicitly models dependencies across all slides from the same patient via a case transformer during pretraining, combining multi-stage open self-supervision with scaled low-cost task supervision. In Stage&nbsp;1, we pretrain a vision-only slide encoder on 77,134 public slide feature grids using masked self-distillation. In Stage&nbsp;2, we align these representations with clinical semantics using a case transformer and multi-task supervision over 333 tasks from 56 public datasets, including 205 classification and 128 survival tasks across four endpoints. Across eight held-out tasks with five-fold frozen-feature probe evaluation, MOOZY achieves best or tied-best performance on most metrics and improves macro averages over TITAN by +7.37%, +5.50%, and +7.83% and over PRISM by +8.83%, +10.70%, and +9.78% for weighted F1, weighted ROC-AUC, and balanced accuracy, respectively.
     </p>
   </div>
@@ -134,7 +135,7 @@ <h2 class="section-heading">Training Data Scale</h2>
        style="width:100%; border-radius:8px;">
 </div>
 <div class="column is-6">
-  <p style="line-height:1.7">
+  <p style="line-height:1.8; font-size:1.1rem;">
     MOOZY is trained entirely on public data. Stage&nbsp;1 uses 77,134 slide feature grids
     (53,286 at 20&times; and 23,848 at 40&times;) extracted from ~1.67 billion patches across ~31.8&nbsp;TB of raw WSI data.
     Stage&nbsp;2 uses 41,089 supervised cases across 333 tasks from 56 datasets &mdash;
@@ -281,7 +282,7 @@ <h2 class="section-heading">Results</h2>
 <section class="section">
   <div class="container">
     <h2 class="section-heading">Where Does MOOZY Look?</h2>
-    <p style="text-align:center; color: var(--text-secondary); margin-bottom:1.5rem; max-width:750px; margin-left:auto; margin-right:auto;">
+    <p style="text-align:center; margin-bottom:1.5rem; max-width:750px; margin-left:auto; margin-right:auto; font-size:1.1rem; line-height:1.8;">
       A board-certified pathologist reviewed attention maps across 20 representative WSIs and five encoders.
       MOOZY achieved the lowest mean semantic gap score (1.00) and near-balanced tumor vs. non-tumor attention (2.63),
       suggesting broad, diagnostically relevant coverage.
@@ -308,7 +309,7 @@ <h2 class="section-heading">Where Does MOOZY Look?</h2>
   <span class="dot"></span>
 </div>
 <!-- Captions per slide -->
-<div class="carousel-caption fig-caption" style="display:block">Lung adenocarcinoma. MOOZY and TITAN: balanced, comprehensive coverage. CHIEF and Madeleine: cancer-biased with semantic gaps.</div>
+<div class="carousel-caption fig-caption" style="display:block">Attention-map comparison on a lung adenocarcinoma slide. MOOZY and TITAN: balanced, comprehensive coverage (shift 3, gap 1). PRISM: balanced shift with moderate gaps (shift 3, gap 2). CHIEF and Madeleine: cancer-biased with frequent semantic gaps (shift 2, gap 3).</div>
 <div class="carousel-caption fig-caption" style="display:none">Attention comparison across five encoders on a representative WSI.</div>
 <div class="carousel-caption fig-caption" style="display:none">Attention comparison across five encoders on a representative WSI.</div>
 <div class="carousel-caption fig-caption" style="display:none">Attention comparison across five encoders on a representative WSI.</div>
@@ -321,7 +322,7 @@ <h2 class="section-heading">Where Does MOOZY Look?</h2>
 <section class="section section-alt" id="embeddings-section">
   <div class="container">
     <h2 class="section-heading">Embedding Quality</h2>
-    <p style="text-align:center; color: var(--text-secondary); margin-bottom:1.5rem; max-width:700px; margin-left:auto; margin-right:auto;">
+    <p style="text-align:center; margin-bottom:1.5rem; max-width:700px; margin-left:auto; margin-right:auto; font-size:1.1rem; line-height:1.8;">
       Dimensionality reduction of slide embeddings from four encoders. MOOZY shows the clearest class separation on cancer-type tasks.
     </p>

@@ -335,7 +336,8 @@ <h2 class="section-heading">Embedding Quality</h2>
   <button class="toggle-btn toggle-task" data-task="tcga_cancer_type">TCGA Cancer Type</button>
 </div>

-<div class="embedding-grid">
+<div id="emb-spinner" style="display:none; text-align:center; margin-bottom:0.5rem; font-size:0.85rem; color:#555;">Loading...</div>
+<div class="embedding-grid" style="transition: opacity 0.2s;">
 <div class="emb-cell">
   <img class="emb-img" data-encoder="moozy" src="static/images/embeddings/umap_cptac_cancer_type_moozy.webp" alt="MOOZY embedding" loading="lazy">
   <p class="emb-label">MOOZY</p>
@@ -360,16 +362,11 @@ <h2 class="section-heading">Embedding Quality</h2>
 <section class="section section-alt">
   <div class="container" style="max-width:700px">
     <h2 class="section-heading">Citation</h2>
-    <div class="bibtex-block">
-      <button class="copy-btn">Copy</button>
-      <code>@article{moozy2026,
-title = {MOOZY: A Patient-First Foundation Model for
-Computational Pathology},
-author = {Kotp, Yousef and Trinh, Vincent Quoc-Huy
-and Pal, Christopher and Hosseini, Mahdi S.},
+    <div class="bibtex-block"><button class="copy-btn">Copy</button><code>@article{moozy2026,
+title = {MOOZY: A Patient-First Foundation Model for Computational Pathology},
+author = {Kotp, Yousef and Trinh, Vincent Quoc-Huy and Pal, Christopher and Hosseini, Mahdi S.},
 year = {2026}
-}</code>
-</div>
+}</code></div>
   </div>
 </section>

static/css/index.css

Lines changed: 8 additions & 8 deletions

@@ -3,8 +3,8 @@
 :root {
   --accent: #222;
   --accent-light: rgba(0, 0, 0, 0.04);
-  --text-primary: #111;
-  --text-secondary: #333;
+  --text-primary: #000;
+  --text-secondary: #111;
   --bg-alt: #f8f9fa;
   --border: #d1d5db;
 }
@@ -20,21 +20,21 @@ body {

 /* Hero */
 .hero-title {
-  font-size: 2.8rem;
+  font-size: 3.2rem;
   font-weight: 700;
   letter-spacing: -0.02em;
   color: #000;
 }

 .hero-subtitle {
-  font-size: 1.1rem;
+  font-size: 1.3rem;
   color: var(--text-secondary);
   max-width: 700px;
   margin: 0 auto;
 }

 .author-list {
-  font-size: 1rem;
+  font-size: 1.15rem;
   margin-bottom: 0.5rem;
 }

@@ -50,7 +50,7 @@ body {
 }

 .affiliation-list {
-  font-size: 0.85rem;
+  font-size: 1rem;
   color: var(--text-secondary);
 }

@@ -66,10 +66,10 @@ body {
   display: inline-flex;
   align-items: center;
   gap: 0.35rem;
-  padding: 0.45rem 1rem;
+  padding: 0.5rem 1.2rem;
   border: 1px solid var(--border);
   border-radius: 6px;
-  font-size: 0.9rem;
+  font-size: 1.05rem;
   color: var(--text-primary);
   text-decoration: none;
   transition: border-color 0.2s, background 0.2s;

static/js/index.js

Lines changed: 33 additions & 3 deletions

@@ -86,23 +86,53 @@ function initEmbeddingToggles() {
   let method = "umap";
   let task = "cptac_cancer_type";

-  // Preload all embedding images
+  // Preload all embedding images into a cache
   const methods = ["umap", "tsne"];
   const tasks = ["cptac_cancer_type", "organs", "tcga_cancer_type"];
   const encoders = ["moozy", "titan", "madeleine", "prism"];
+  const cache = {};
   methods.forEach((m) => {
     tasks.forEach((t) => {
       encoders.forEach((e) => {
+        const key = `${m}_${t}_${e}`;
         const img = new Image();
-        img.src = `static/images/embeddings/${m}_${t}_${e}.webp`;
+        img.src = `static/images/embeddings/${key}.webp`;
+        cache[key] = img;
       });
     });
   });

+  const grid = section.querySelector(".embedding-grid");
+  const spinner = document.getElementById("emb-spinner");
+
   function update() {
+    let loaded = 0;
+    const total = images.length;
+    spinner.style.display = "block";
+    grid.style.opacity = "0.4";
+
     images.forEach((img) => {
       const encoder = img.dataset.encoder;
-      img.src = `static/images/embeddings/${method}_${task}_${encoder}.webp`;
+      const key = `${method}_${task}_${encoder}`;
+      const cached = cache[key];
+
+      if (cached && cached.complete) {
+        img.src = cached.src;
+        loaded++;
+        if (loaded === total) {
+          grid.style.opacity = "1";
+          spinner.style.display = "none";
+        }
+      } else {
+        img.onload = () => {
+          loaded++;
+          if (loaded === total) {
+            grid.style.opacity = "1";
+            spinner.style.display = "none";
+          }
+        };
+        img.src = `static/images/embeddings/${key}.webp`;
+      }
     });
   }
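The per-image load counting inside `update()` could also be expressed with `Promise.all`, which keeps the "hide spinner, restore opacity" step in one place instead of duplicating it across the cached and uncached branches. A minimal sketch, assuming a hypothetical `loadImage` helper (stubbed here so it runs outside the browser; on the page it would wrap `new Image()` with an `onload` handler):

```javascript
// Sketch of the spinner pattern: enter a busy state, wait for every
// image to settle, then leave the busy state once.
// `loadImage` and `refreshGrid` are illustrative names, not part of the commit.
function loadImage(src) {
  return new Promise((resolve) => {
    // Browser version: const img = new Image();
    // img.onload = () => resolve(img); img.src = src;
    resolve({ src, complete: true }); // stub so the sketch is runnable anywhere
  });
}

async function refreshGrid(urls, ui) {
  ui.busy = true;                               // e.g. show spinner, dim grid
  const images = await Promise.all(urls.map(loadImage));
  ui.busy = false;                              // e.g. hide spinner, opacity 1
  return images;
}
```

One trade-off: `Promise.all` settles only after the slowest image, whereas the commit's counter-based version behaves the same way, so the two are equivalent for this UI; the promise form simply centralizes the completion check.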
