
Always use return in formatter #278

Merged
JoshuaLampert merged 1 commit into main from always-use-return
Dec 20, 2025

Conversation

@JoshuaLampert
Member

This is the PR analogous to trixi-framework/Trixi.jl#2707. Here, though, we can enforce the `return` via JuliaFormatter.jl because we do not use `@kernel` kernels from KernelAbstractions.jl.
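For context: JuliaFormatter.jl provides an `always_use_return` option that rewrites the last expression of each function body into an explicit `return` (option name as documented by JuliaFormatter.jl; check the current docs before relying on it). A minimal sketch of a repository-level formatter configuration:

```toml
# .JuliaFormatter.toml (sketch; option names per the JuliaFormatter.jl docs)
# Rewrite implicit trailing expressions into explicit `return` statements.
always_use_return = true
```

This is why the option cannot be enabled in Trixi.jl itself: bodies of KernelAbstractions.jl `@kernel` functions must not contain `return` statements, so a formatter pass that inserts them would break those kernels.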

@github-actions
Contributor

Benchmark Results (Julia v1.10)

Time benchmarks

| Benchmark | main | cc00b21... | main / cc00b21... |
|---|---|---|---|
| bbm_1d/bbm_1d_basic.jl - rhs! | 13.8 ± 0.28 μs | 13.8 ± 0.29 μs | 1 ± 0.029 |
| bbm_1d/bbm_1d_fourier.jl - rhs! | 0.535 ± 0.01 ms | 0.224 ± 0.31 ms | 2.39 ± 3.3 |
| bbm_bbm_1d/bbm_bbm_1d_basic_reflecting.jl - rhs! | 0.0809 ± 0.00046 ms | 0.0805 ± 0.0004 ms | 1 ± 0.0076 |
| bbm_bbm_1d/bbm_bbm_1d_dg.jl - rhs! | 0.0345 ± 0.00054 ms | 0.034 ± 0.0007 ms | 1.01 ± 0.026 |
| bbm_bbm_1d/bbm_bbm_1d_relaxation.jl - rhs! | 27.4 ± 0.46 μs | 27.5 ± 0.56 μs | 0.995 ± 0.026 |
| bbm_bbm_1d/bbm_bbm_1d_upwind_relaxation.jl - rhs! | 0.0483 ± 0.00058 ms | 0.0487 ± 0.00069 ms | 0.993 ± 0.018 |
| hyperbolic_serre_green_naghdi_1d/hyperbolic_serre_green_naghdi_dingemans.jl - rhs! | 4.28 ± 0.06 μs | 4.24 ± 0.03 μs | 1.01 ± 0.016 |
| kdv_1d/kdv_1d_basic.jl - rhs! | 1.44 ± 0.011 μs | 1.46 ± 0.019 μs | 0.986 ± 0.015 |
| kdv_1d/kdv_1d_implicit.jl - rhs! | 1.39 ± 0.019 μs | 1.41 ± 0.01 μs | 0.986 ± 0.015 |
| serre_green_naghdi_1d/serre_green_naghdi_well_balanced.jl - rhs! | 0.206 ± 0.0094 ms | 0.202 ± 0.0095 ms | 1.02 ± 0.067 |
| svaerd_kalisch_1d/svaerd_kalisch_1d_dingemans_relaxation.jl - rhs! | 0.152 ± 0.0063 ms | 0.148 ± 0.0049 ms | 1.03 ± 0.055 |
| time_to_load | 2.04 ± 0.014 s | 2.05 ± 0.0096 s | 0.998 ± 0.0081 |
Memory benchmarks

| Benchmark | main | cc00b21... | main / cc00b21... |
|---|---|---|---|
| bbm_1d/bbm_1d_basic.jl - rhs! | 1 allocs: 4.12 kB | 1 allocs: 4.12 kB | 1 |
| bbm_1d/bbm_1d_fourier.jl - rhs! | 1 allocs: 4.12 kB | 1 allocs: 4.12 kB | 1 |
| bbm_bbm_1d/bbm_bbm_1d_basic_reflecting.jl - rhs! | 5 allocs: 1.17 kB | 5 allocs: 1.17 kB | 1 |
| bbm_bbm_1d/bbm_bbm_1d_dg.jl - rhs! | 10 allocs: 8.62 kB | 10 allocs: 8.62 kB | 1 |
| bbm_bbm_1d/bbm_bbm_1d_relaxation.jl - rhs! | 2 allocs: 8.25 kB | 2 allocs: 8.25 kB | 1 |
| bbm_bbm_1d/bbm_bbm_1d_upwind_relaxation.jl - rhs! | 2 allocs: 8.25 kB | 2 allocs: 8.25 kB | 1 |
| hyperbolic_serre_green_naghdi_1d/hyperbolic_serre_green_naghdi_dingemans.jl - rhs! | 0 allocs: 0 B | 0 allocs: 0 B | |
| kdv_1d/kdv_1d_basic.jl - rhs! | 0 allocs: 0 B | 0 allocs: 0 B | |
| kdv_1d/kdv_1d_implicit.jl - rhs! | 0 allocs: 0 B | 0 allocs: 0 B | |
| serre_green_naghdi_1d/serre_green_naghdi_well_balanced.jl - rhs! | 0.075 k allocs: 0.66 MB | 0.075 k allocs: 0.66 MB | 1 |
| svaerd_kalisch_1d/svaerd_kalisch_1d_dingemans_relaxation.jl - rhs! | 0.042 k allocs: 0.315 MB | 0.042 k allocs: 0.315 MB | 1 |
| time_to_load | 0.153 k allocs: 14.5 kB | 0.153 k allocs: 14.5 kB | 1 |

@codecov-commenter

Codecov Report

❌ Patch coverage is 98.03922% with 1 line in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/util.jl | 80.00% | 1 Missing ⚠️ |


@coveralls
Collaborator

Pull Request Test Coverage Report for Build 20376964420

Details

  • 50 of 51 (98.04%) changed or added relevant lines in 17 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall first build on always-use-return at 98.481%

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
|---|---|---|---|
| src/util.jl | 4 | 5 | 80.0% |

Totals Coverage Status

  • Change from base Build 20331931089: 98.5%
  • Covered Lines: 2334
  • Relevant Lines: 2370

💛 - Coveralls

@ranocha (Member) left a comment


Thanks!

@JoshuaLampert JoshuaLampert merged commit 62ae745 into main Dec 20, 2025
13 checks passed
@JoshuaLampert JoshuaLampert deleted the always-use-return branch December 20, 2025 09:09


4 participants