
Commit 9e91a17

Browse files
stevengj authored and jrevels committed
add quick start to README (#82)
* add quick start to README

  The vast majority of users probably just want `@btime`, so I think it is useful to include a "quick start" section in the manual that just gives a couple of examples (including interpolation).

* global -> external, add links

* move quick start after documentation section
1 parent f820b8b commit 9e91a17

File tree

1 file changed: +29 −0 lines


README.md

Lines changed: 29 additions & 0 deletions
@@ -29,6 +29,35 @@ If you want an extensive example of a benchmark suite being used in the real wor

If you're benchmarking on Linux, I wrote up a series of [tips and tricks](https://github.com/JuliaCI/BenchmarkTools.jl/blob/master/doc/linuxtips.md) to help eliminate noise during performance tests.

## Quick Start

The simplest usage is via the [`@btime` macro](https://github.com/JuliaCI/BenchmarkTools.jl/blob/master/doc/manual.md#benchmarking-basics), which is analogous to Julia's built-in [`@time` macro](https://docs.julialang.org/en/stable/stdlib/base/#Base.@time) but is often more accurate because it collects results over multiple runs:

```julia
julia> using BenchmarkTools, Compat # you need to use both modules

julia> @btime sin(1)
  15.081 ns (0 allocations: 0 bytes)
0.8414709848078965
```
43+
44+
If the expression you want to benchmark depends on external variables, you should use [`$` to "interpolate"](https://github.com/JuliaCI/BenchmarkTools.jl/blob/master/doc/manual.md#interpolating-values-into-benchmark-expressions) them into the benchmark expression to [avoid the problems of benchmarking with globals](https://docs.julialang.org/en/stable/manual/performance-tips/#Avoid-global-variables-1). Essentially, any interpolated variable `$x` or expression `$(...)` is "pre-computed" before benchmarking begins:
45+
46+
```julia
julia> A = rand(3,3);

julia> @btime inv($A);            # we interpolate the global variable A with $A
  1.191 μs (10 allocations: 2.31 KiB)

julia> @btime inv($(rand(3,3)));  # interpolation: the rand(3,3) call occurs before benchmarking
  1.192 μs (10 allocations: 2.31 KiB)

julia> @btime inv(rand(3,3));     # the rand(3,3) call is included in the benchmark time
  1.295 μs (11 allocations: 2.47 KiB)
```

As described in the [manual](doc/manual.md), the BenchmarkTools package supports many other features, both for additional output and for more fine-grained control over the benchmarking process.
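
For example, when the single minimum-time line printed by `@btime` isn't enough, the package's `@benchmark` macro runs the same measurement process but returns a `Trial` object holding every sample, from which summary estimates can be taken. A minimal sketch (timings vary by machine, so no output is shown; the 3×3 matrix is just an arbitrary workload):

```julia
julia> using BenchmarkTools

julia> t = @benchmark inv($(rand(3,3)));  # collect many samples into a Trial

julia> minimum(t);  # estimate from the fastest sample; median(t) and mean(t) also work
```

Note that the same `$` interpolation rules apply to `@benchmark` as to `@btime`.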

## Why does this package exist?

Our story begins with two packages, "Benchmarks" and "BenchmarkTrackers". The Benchmarks package implemented an execution strategy for collecting and summarizing individual benchmark results, while BenchmarkTrackers implemented a framework for organizing, running, and determining regressions of groups of benchmarks. Under the hood, BenchmarkTrackers relied on Benchmarks for actual benchmark execution.
