Merged
Changes from all commits (48 commits)
9c9ab67
creating skeleton for benchmarks starting with creating some test_que…
Rahban1 May 22, 2025
9018d75
created functions for measuring search performance
Rahban1 May 22, 2025
71b1e6b
Merge branch 'JuliaDocs:master' into benchmarks
Rahban1 Jun 11, 2025
7b828ab
added comments and changed some tests
Rahban1 Jun 11, 2025
c533f44
search mockup
Rahban1 Jun 11, 2025
fbc226e
creating wrapper.js as a separate file
Rahban1 Jun 14, 2025
d750e7d
only install the exact version of minisearch
Rahban1 Jun 27, 2025
91d3001
ran runic for linting
Rahban1 Jun 27, 2025
044506c
make command
Rahban1 Jun 28, 2025
56dc16a
changelog
Rahban1 Jun 28, 2025
3dd42ed
Merge branch 'master' into benchmarks
Rahban1 Jun 28, 2025
9beb5d2
typo in comments
Rahban1 Jul 1, 2025
8b99e46
running benchmarks on ci
Rahban1 Jul 2, 2025
3e1954d
building the docs before running benchmarks
Rahban1 Jul 2, 2025
fd7891b
fixing the correct position of search index
Rahban1 Jul 2, 2025
5f9f074
prettify the output
Rahban1 Jul 4, 2025
825ed0c
add color in tables and add PrettyTables in Project.toml
Rahban1 Jul 4, 2025
72e616d
added crayons in Project.toml
Rahban1 Jul 4, 2025
76a1201
change to @__DIR__ for relative paths
Rahban1 Jul 4, 2025
6beae8c
use the search index used in prod
Rahban1 Jul 5, 2025
7e8ea88
new queries
Rahban1 Jul 5, 2025
4ccb192
full benchmark report is uploaded as an artifact and moved the benchm…
Rahban1 Jul 5, 2025
da1f773
ran runic for linting
Rahban1 Jul 5, 2025
73144dc
added node setup in CI
Rahban1 Jul 5, 2025
0865207
write js code to temp file to get rid of error
Rahban1 Jul 5, 2025
8938733
ran runic for linting
Rahban1 Jul 5, 2025
b7c2599
removed the redundant npm install logic from test/search/real_search.…
Rahban1 Jul 5, 2025
77d08f2
ran the temp file within test/search so it finds the minisearch module
Rahban1 Jul 6, 2025
ef382cb
ran runic for linting
Rahban1 Jul 6, 2025
0f32b37
now the values are compared to a fixed set of values every time it runs
Rahban1 Jul 6, 2025
2d08814
ran runic for linting
Rahban1 Jul 7, 2025
7ca4173
Merge branch 'master' into benchmarks
Rahban1 Jul 9, 2025
cfe1608
Change PrettyTables.jl and Crayons from deps of package to test deps
Rahban1 Jul 9, 2025
835513d
version's source of truth is now in HTMLWriter.jl
Rahban1 Jul 10, 2025
2b685cf
Merge branch 'master' into benchmarks
Rahban1 Jul 10, 2025
f4eaeda
Ran prettier
Rahban1 Jul 10, 2025
6a28a67
removed running in matrix in CI.yml
Rahban1 Jul 10, 2025
341fc76
adding NodeJLL
Rahban1 Jul 10, 2025
dfbdb29
reverting the nodeJLL changes
Rahban1 Jul 12, 2025
17348d1
Fix CI by creating package.json on the fly
Rahban1 Jul 13, 2025
dd2608e
Changes linux command to fetch the minisearch version
Rahban1 Jul 13, 2025
dc68379
Instantiate main project environment
Rahban1 Jul 13, 2025
3f32001
change test/search to use [sources]
mortenpi Jul 13, 2025
8a4d747
Remove CHANGELOG.md as it is not a user-facing change
Rahban1 Jul 14, 2025
f9212d8
Not exclude search.js from Documenter.js rather just replacing the pl…
Rahban1 Jul 14, 2025
f8c12de
Uses Documenter in real_search.jl and remove the making of package.js…
Rahban1 Jul 14, 2025
ec80a78
ran runic for linting
Rahban1 Jul 14, 2025
373f3ec
Add prefix to specifically mention to install the npm packages in the…
Rahban1 Jul 15, 2025
36 changes: 36 additions & 0 deletions .github/workflows/CI.yml
@@ -213,6 +213,42 @@ jobs:
with:
name: PDF build logs
path: ${{ github.workspace }}/latex-debug-logs
- name: Upload search index
if: ${{ matrix.format == 'html' }}
uses: actions/upload-artifact@v4
with:
name: search-index
path: docs/build/search_index.js

benchmarks:
name: Search Benchmarks
runs-on: ubuntu-latest
needs: docs
steps:
- uses: actions/checkout@v4
- uses: julia-actions/setup-julia@v2
with:
version: 1
arch: x64
show-versioninfo: true
- uses: julia-actions/cache@v2
- uses: julia-actions/julia-buildpkg@v1
- name: Instantiate main project environment
run: julia --project=. -e 'using Pkg; Pkg.instantiate()'
- name: Download search index
uses: actions/download-artifact@v4
with:
name: search-index
path: docs/build
- name: Run search benchmarks
run: make search-benchmarks
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload search benchmark results
uses: actions/upload-artifact@v4
with:
name: search-benchmark-results
path: test/search/search_benchmark_results_*.txt

linkcheck:
name: "Linkcheck: online tests"
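For orientation, the new benchmarks job boils down to the following local sketch (a sketch only, in Julia: it assumes the search-benchmarks Make target added below and a prior docs build producing docs/build/search_index.js, which CI instead restores from the search-index artifact):

# Rough local equivalent of the CI "benchmarks" job above; not the exact CI setup.
isfile("docs/build/search_index.js") ||
    error("Build the docs first so docs/build/search_index.js exists.")
# The Make target wraps: julia --project=test/search test/search/run_benchmarks.jl
run(`make search-benchmarks`)
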
10 changes: 9 additions & 1 deletion Makefile
@@ -32,6 +32,12 @@ install-runic:
test:
${JULIA} --project -e 'using Pkg; Pkg.test()'

search-benchmarks: test/search/Manifest.toml
${JULIA} --project=test/search test/search/run_benchmarks.jl

test/search/Manifest.toml: test/search/Project.toml
${JULIA} --project=test/search -e'using Pkg; Pkg.instantiate()'

clean:
rm -f Manifest.toml
rm -f docs/Manifest.toml
@@ -49,6 +55,7 @@ clean:
rm -rf test/plugins/build
rm -rf test/quietly-logs
rm -rf test/workdir/builds
rm -f test/search/search_benchmark_results_*.txt


help:
@@ -60,7 +67,8 @@ help:
@echo " - make format-julia: formats the Julia source code with Runic"
@echo " - make install-runic: installs Runic.jl into the @runic shared Julia environment (for make format)"
@echo " - make test: run the tests"
@echo " - make search-benchmarks: run search functionality benchmarks"
@echo " - make themes: compile Documenter's native CSS themes"
@echo " - make clean: remove generated files"

.PHONY: default docs-instantiate themes help changelog docs test format-julia install-runic
.PHONY: default docs-instantiate themes help changelog docs test format-julia install-runic search-benchmarks
4 changes: 3 additions & 1 deletion Project.toml
@@ -55,9 +55,11 @@ Unicode = "1.6"
julia = "1.6"

[extras]
Crayons = "a8cc5b0e-0ffa-5ad4-8c14-923d3ee1735f"
DocInventories = "43dc2714-ed3b-44b5-b226-857eda1aa7de"
PrettyTables = "08abe8d2-0d0c-5749-adfa-8a2ac140af0d"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
UUIDs = "cf7118a7-6976-5b1a-9a39-7adc72f591a4"

[targets]
test = ["Random", "UUIDs", "DocInventories"]
test = ["Random", "UUIDs", "DocInventories", "Crayons", "PrettyTables"]
11 changes: 7 additions & 4 deletions assets/html/js/search.js
@@ -67,7 +67,7 @@ update_search

function worker_function(documenterSearchIndex, documenterBaseURL, filters) {
importScripts(
"https://cdn.jsdelivr.net/npm/minisearch@6.1.0/dist/umd/index.min.js",
"https://cdn.jsdelivr.net/npm/minisearch@__MINISEARCH_VERSION__/dist/umd/index.min.js",
);

let data = documenterSearchIndex.map((x, key) => {
@@ -606,11 +606,14 @@ function waitUntilSearchIndexAvailable() {
// has finished loading and documenterSearchIndex gets defined.
// So we need to wait until the search index actually loads before setting
// up all the search-related stuff.
if (typeof documenterSearchIndex !== "undefined") {
if (
typeof documenterSearchIndex !== "undefined" &&
typeof $ !== "undefined"
) {
runSearchMainCode();
} else {
console.warn("Search Index not available, waiting");
setTimeout(waitUntilSearchIndexAvailable, 1000);
console.warn("Search Index or jQuery not available, waiting");
setTimeout(waitUntilSearchIndexAvailable, 100);
}
}

14 changes: 13 additions & 1 deletion src/html/HTMLWriter.jl
@@ -87,6 +87,8 @@
const THEMES = ["documenter-light", "documenter-dark", "catppuccin-latte", "catppuccin-frappe", "catppuccin-macchiato", "catppuccin-mocha"]
"The root directory of the HTML assets."
const ASSETS = normpath(joinpath(@__DIR__, "..", "..", "assets", "html"))
"The version of minisearch to use."
const MINISEARCH_VERSION = "6.1.0"
"The directory where all the Sass/SCSS files needed for theme building are."
const ASSETS_SASS = joinpath(ASSETS, "scss")
"Directory for the compiled CSS files of the themes."
@@ -802,7 +804,17 @@
for filename in readdir(joinpath(ASSETS, "js"))
path = joinpath(ASSETS, "js", filename)
endswith(filename, ".js") && isfile(path) || continue
push!(r, JSDependencies.parse_snippet(path))

content = read(path, String)
if filename == "search.js"
if isfile(joinpath(doc.user.source, "assets", "search.js"))
@warn "not embedding 'search.js', provided by the user."
continue

end
content = replace(content, "__MINISEARCH_VERSION__" => MINISEARCH_VERSION)
end

push!(r, JSDependencies.parse_snippet(IOBuffer(content)))
end
JSDependencies.verify(r; verbose = true) || error("RequireJS declaration is invalid")
JSDependencies.writejs(joinpath(doc.user.build, "assets", "documenter.js"), r)
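As a quick illustration of the placeholder substitution above (the constant value and URL are taken from the diffs; the snippet itself is only a sketch):

# How __MINISEARCH_VERSION__ in search.js resolves at build time.
MINISEARCH_VERSION = "6.1.0"
url = "https://cdn.jsdelivr.net/npm/minisearch@__MINISEARCH_VERSION__/dist/umd/index.min.js"
resolved = replace(url, "__MINISEARCH_VERSION__" => MINISEARCH_VERSION)
@assert resolved == "https://cdn.jsdelivr.net/npm/minisearch@6.1.0/dist/umd/index.min.js"
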
11 changes: 11 additions & 0 deletions test/search/Project.toml
@@ -0,0 +1,11 @@
[deps]
Crayons = "a8cc5b0e-0ffa-5ad4-8c14-923d3ee1735f"
Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
JSON = "682c06a0-de6a-54ab-a142-c8b1cf79cde6"
NodeJS_22_jll = "8fca9ca2-e7a1-5ccf-8c05-43be5a78664f"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
PrettyTables = "08abe8d2-0d0c-5749-adfa-8a2ac140af0d"

[sources]
Documenter = { path="../.." }
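The search-benchmarks Make target above instantiates this environment automatically; done by hand it is roughly (a sketch, run from the repository root):

# Manual equivalent of the Makefile's test/search/Manifest.toml prerequisite.
# The [sources] entry makes the environment use the local Documenter checkout at ../..
using Pkg
Pkg.activate("test/search")
Pkg.instantiate()
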
113 changes: 113 additions & 0 deletions test/search/evaluate.jl
@@ -0,0 +1,113 @@
# Represents the evaluation results for a single search query
struct QueryResult
query::String
precision::Float64
recall::Float64
f1::Float64
expected::Vector{String}
actual::Vector{String}
# Raw integer values used in calculations
relevant_count::Int # Number of relevant documents found
total_retrieved::Int # Total number of documents retrieved
total_relevant::Int # Total number of relevant documents
end

# Aggregates evaluation results across multiple search queries
struct EvaluationResults
individual_results::Vector{QueryResult}
average_precision::Float64
average_recall::Float64
average_f1_score::Float64
# Raw integer values for overall evaluation
total_relevant_found::Int # Total number of relevant documents found across all queries
total_documents_retrieved::Int # Total number of documents retrieved across all queries
total_relevant_documents::Int # Total number of relevant documents across all queries
end

# Calculates precision for search results against expected documents
# Precision = (relevant documents found) / (total documents retrieved)
# Returns precision score, count of relevant documents found, and total documents retrieved
function calculate_precision(results, expected_docs)
if isempty(results)
return 0.0, 0, 0
end

relevant_count = length(intersect(results, expected_docs))
total_retrieved = length(results)

return relevant_count / total_retrieved, relevant_count, total_retrieved
end

# Calculates recall for search results against expected documents
# Recall = (relevant documents found) / (total relevant documents)
# Measures completeness of the search results - how many of the relevant documents were found
# Returns recall score, count of relevant documents found, and total relevant documents
function calculate_recall(results, expected_docs)
if isempty(expected_docs)
return 1.0, 0, 0
end

found_count = length(intersect(results, expected_docs))
total_relevant = length(expected_docs)

return found_count / total_relevant, found_count, total_relevant
end

# Calculates F1 score from precision and recall values
# F1 = 2 * (precision * recall) / (precision + recall)
# Combines precision and recall into a single score, giving equal weight to both metrics
# Returns 0.0 if both precision and recall are 0
function calculate_f1(precision, recall)
if precision + recall == 0
return 0.0
end

return 2 * (precision * recall) / (precision + recall)
end

# Evaluates a single search query using the provided search function
# Returns a QueryResult containing precision, recall, and F1 metrics
function evaluate_query(search_function, query::TestQuery)
results = search_function(query.query)

precision, relevant_count, total_retrieved = calculate_precision(results, query.expected_docs)
recall, found_count, total_relevant = calculate_recall(results, query.expected_docs)
f1 = calculate_f1(precision, recall)

return QueryResult(
query.query,
precision,
recall,
f1,
query.expected_docs,
results,
relevant_count,
total_retrieved,
total_relevant
)
end

# Evaluates multiple search queries and aggregates the results
# Returns an EvaluationResults containing average metrics across all queries
function evaluate_all(search_function, queries)
results = [evaluate_query(search_function, q) for q in queries]

avg_precision = mean([r.precision for r in results])
avg_recall = mean([r.recall for r in results])
avg_f1 = mean([r.f1 for r in results])

# Calculate total raw values across all queries
total_relevant_found = sum(r.relevant_count for r in results)
total_documents_retrieved = sum(r.total_retrieved for r in results)
total_relevant_documents = sum(r.total_relevant for r in results)

return EvaluationResults(
results,
avg_precision,
avg_recall,
avg_f1,
total_relevant_found,
total_documents_retrieved,
total_relevant_documents
)
end
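A hypothetical usage sketch of these helpers (the TestQuery shape, the mock search function, and the query/document names below are assumptions for illustration; the real definitions live elsewhere in the test suite):

using Statistics                      # evaluate_all uses mean

# Assumed shape of the query type consumed by evaluate_query.
struct TestQuery
    query::String
    expected_docs::Vector{String}
end

include("evaluate.jl")                # run from test/search/

# Stand-in search function that always returns the same two document locations.
mock_search(_query) = ["manual/guide.md", "lib/public.md"]

queries = [
    TestQuery("makedocs", ["manual/guide.md"]),
    TestQuery("doctest", ["manual/doctests.md"]),
]

results = evaluate_all(mock_search, queries)
println("average precision: ", results.average_precision)  # 0.25 for this mock data
println("average recall:    ", results.average_recall)      # 0.5
println("average F1:        ", results.average_f1_score)
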
61 changes: 61 additions & 0 deletions test/search/real_search.jl
@@ -0,0 +1,61 @@
using JSON
using NodeJS_22_jll
using Documenter

# Load the real search index from test examples (already built!)
function load_real_search_index()
# Use the example search index that's already built and tested
search_index_path = joinpath(@__DIR__, "../../docs/build/search_index.js")

if !isfile(search_index_path)
error("Search index not found at: $search_index_path")
end

# Read and parse the JavaScript file
content = read(search_index_path, String)

# Find the JSON data after "var documenterSearchIndex = "
json_start = findfirst("var documenterSearchIndex = ", content)
if json_start === nothing
error("Invalid search index format: missing variable declaration")
end

# Extract JSON content (everything after the variable declaration)
json_content = content[(last(json_start) + 1):end]

# Parse the JSON
parsed = JSON.parse(json_content)
return parsed["docs"] # Return just the docs array
end

# Simple function that uses the existing search.js with real search data
function real_search(query::String)
# Load the real search index automatically
search_index_data = load_real_search_index()

# Read the JS wrapper and inject data
wrapper_js = read(joinpath(@__DIR__, "wrapper.js"), String)
wrapper_js = replace(wrapper_js, "__SEARCH_INDEX__" => JSON.json(search_index_data))
wrapper_js = replace(wrapper_js, "__QUERY__" => "\"" * query * "\"")


# Write the wrapper to a temporary file and run it
return mktemp(@__DIR__) do path, io
write(io, wrapper_js)
close(io)
cd(@__DIR__) do
# Install minisearch if it's not there
if !isdir("node_modules") || !isfile("node_modules/minisearch/package.json")
version = Documenter.HTMLWriter.MINISEARCH_VERSION
if version === nothing
error("Could not find minisearch version in search.js")
end
# We have to pass --prefix here, otherwise npm might try to install
# minisearch in a different location depending on the environment.
run(`$(NodeJS_22_jll.npm) --prefix . install minisearch@$(version)`)
end
result = read(`$(NodeJS_22_jll.node) $path`, String)
return JSON.parse(strip(result))
end
end
end
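A hypothetical local invocation (it assumes the docs have already been built so docs/build/search_index.js exists; the exact shape of the returned JSON depends on wrapper.js, which is not shown in this diff):

include("real_search.jl")             # run from test/search/

hits = real_search("makedocs")        # runs the query through minisearch via Node
println("got ", length(hits), " result(s)")
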