Commit 2d8c7a4

Changed http to https in documentation links

1 parent 52c1295 commit 2d8c7a4

File tree

7 files changed: +62 −27 lines

R/bayesboot.R

Lines changed: 3 additions & 3 deletions

@@ -71,12 +71,12 @@ is.wholenumber <- function(x, tol = .Machine$double.eps^0.5) {
 #'
 #' For more information regarding this implementation of the Bayesian bootstrap
 #' see the blog post
-#' \href{http://www.sumsar.net/blog/2015/07/easy-bayesian-bootstrap-in-r/}{Easy
+#' \href{https://www.sumsar.net/blog/2015/07/easy-bayesian-bootstrap-in-r/}{Easy
 #' Bayesian Bootstrap in R}. For more information about the model behind the
 #' Bayesian bootstrap see the blog post
-#' \href{http://www.sumsar.net/blog/2015/04/the-non-parametric-bootstrap-as-a-bayesian-model/}{The
+#' \href{https://www.sumsar.net/blog/2015/04/the-non-parametric-bootstrap-as-a-bayesian-model/}{The
 #' Non-parametric Bootstrap as a Bayesian Model} and, of course,
-#' \href{http://projecteuclid.org/euclid.aos/1176345338}{the original Bayesian
+#' \href{https://projecteuclid.org/euclid.aos/1176345338}{the original Bayesian
 #' bootstrap paper by Rubin (1981)}.
 #'
 #' @note \itemize{

README.Rmd

Lines changed: 1 addition & 1 deletion

@@ -113,7 +113,7 @@ lines(cars$speed, colMeans(bb_loess, na.rm = TRUE), type ="l",
 More information
 -----------------------

-For more information on the Bayesian bootstrap see [Rubin's (1981) original paper](https://projecteuclid.org/euclid.aos/1176345338) and my blog post [The Non-parametric Bootstrap as a Bayesian Model](http://sumsar.net/blog/2015/04/the-non-parametric-bootstrap-as-a-bayesian-model/). The implementation of `bayesboot` is similar to the function outlined in the blog post [Easy Bayesian Bootstrap in R](http://sumsar.net/blog/2015/07/easy-bayesian-bootstrap-in-r/), but the interface is slightly different.
+For more information on the Bayesian bootstrap see [Rubin's (1981) original paper](https://projecteuclid.org/euclid.aos/1176345338) and my blog post [The Non-parametric Bootstrap as a Bayesian Model](https://www.sumsar.net/blog/2015/04/the-non-parametric-bootstrap-as-a-bayesian-model/). The implementation of `bayesboot` is similar to the function outlined in the blog post [Easy Bayesian Bootstrap in R](https://www.sumsar.net/blog/2015/07/easy-bayesian-bootstrap-in-r/), but the interface is slightly different.

 References
 ----------------

README.md

Lines changed: 55 additions & 20 deletions

@@ -1,18 +1,20 @@
-`bayesboot`: Easy Bayesian Bootstrap in R
-=========================================
+# `bayesboot`: Easy Bayesian Bootstrap in R

-The `bayesboot` package implements a function `bayesboot` that performs the Bayesian bootstrap introduced by Rubin (1981). The implementation can both handle summary statistics that works on a weighted version of the data or that works on a resampled data set.
+The `bayesboot` package implements a function `bayesboot` that performs
+the Bayesian bootstrap introduced by Rubin (1981). The implementation
+can both handle summary statistics that works on a weighted version of
+the data or that works on a resampled data set.

 `bayesboot` is available on CRAN and can be installed in the usual way:

 ``` r
 install.packages("bayesboot")
 ```

-A simple example
-----------------
+## A simple example

-Here is a Bayesian bootstrap analysis of the mean height of the last ten American presidents:
+Here is a Bayesian bootstrap analysis of the mean height of the last ten
+American presidents:

 ``` r
 # Heights of the last ten American presidents in cm (Kennedy to Obama).
@@ -22,7 +24,8 @@ library(bayesboot)
 b1 <- bayesboot(heights, mean)
 ```

-The resulting posterior distribution in `b1` can now be `plot`ted and `summary`ized:
+The resulting posterior distribution in `b1` can now be `plot`ted and
+`summary`ized:

 ``` r
 summary(b1)
@@ -45,13 +48,21 @@ plot(b1)

 ![](man/figures/README-president_summary-1.png)

-While it is possible to use a summary statistic that works on a resample of the original data, it is more efficient if it's possible to use a summary statistic that works on a *reweighting* of the original dataset. Instead of using `mean` above it would be better to use `weighted.mean` like this:
+While it is possible to use a summary statistic that works on a resample
+of the original data, it is more efficient if it’s possible to use a
+summary statistic that works on a *reweighting* of the original dataset.
+Instead of using `mean` above it would be better to use `weighted.mean`
+like this:

 ``` r
 b2 <- bayesboot(heights, weighted.mean, use.weights = TRUE)
 ```

-The result of a call to `bayesboot` will always result in a `data.frame` with one column per dimension of the summary statistic. If the summary statistic does not return a named vector the columns will be called `V1`, `V2`, etc. The result of a `bayesboot` call can be further inspected and post processed. For example:
+The result of a call to `bayesboot` will always result in a `data.frame`
+with one column per dimension of the summary statistic. If the summary
+statistic does not return a named vector the columns will be called
+`V1`, `V2`, etc. The result of a `bayesboot` call can be further
+inspected and post processed. For example:

 ``` r
 # Given the model and the data, this is the probability that the mean
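The reweighting scheme described in this hunk is easy to sketch outside of R. Below is a minimal, illustrative Python translation (not part of the `bayesboot` package itself): posterior draws of a weighted mean are obtained by drawing uniform Dirichlet weights over the observations, as in Rubin (1981). The height values are invented stand-ins, since the actual data vector lies outside this diff hunk.

```python
import numpy as np

def bayesboot_weighted(data, stat, n_draws=4000, rng=None):
    """Bayesian bootstrap sketch: one uniform-Dirichlet weight vector per draw."""
    rng = np.random.default_rng() if rng is None else rng
    data = np.asarray(data, dtype=float)
    # Each row of `weights` sums to 1 and plays the role of use.weights = TRUE
    weights = rng.dirichlet(np.ones(len(data)), size=n_draws)
    return np.array([stat(data, w) for w in weights])

rng = np.random.default_rng(42)
heights = np.array([183.0, 192, 182, 177, 185, 188, 188, 182, 193, 185])  # made-up

# Analogue of bayesboot(heights, weighted.mean, use.weights = TRUE)
post_mean = bayesboot_weighted(
    heights, lambda d, w: np.average(d, weights=w), rng=rng)

# Each draw is a convex combination of the data, so the posterior of the
# mean is centred near the sample mean.
print(post_mean.mean())
print((post_mean > 175.9).mean())  # cf. mean(b2[,1] > 175.9) in the README
```

Because every draw is a convex combination of the observations, no draw can fall below the smallest data point, which is why thresholds below the data minimum get posterior probability one here.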
@@ -63,7 +74,13 @@ mean(c(b2[,1] > 175.9, TRUE, FALSE))

 ### Comparing two groups

-If we want to compare the means of two groups, we will have to call `bayesboot` twice with each dataset and then use the resulting samples to calculate the posterior difference. For example, let's say we have the heights of the opponents that lost to the presidents in `height` the first time those presidents were elected. Now we are interested in comparing the mean height of American presidents with the mean height of presidential candidates that lost.
+If we want to compare the means of two groups, we will have to call
+`bayesboot` twice with each dataset and then use the resulting samples
+to calculate the posterior difference. For example, let’s say we have
+the heights of the opponents that lost to the presidents in `height` the
+first time those presidents were elected. Now we are interested in
+comparing the mean height of American presidents with the mean height of
+presidential candidates that lost.

 ``` r
 # The heights of oponents of American presidents (first time they were elected).
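The two-group recipe in this hunk — one `bayesboot` run per group, then subtract the posterior draws — can likewise be sketched in Python (illustrative only, not the package's R interface; both height vectors are invented stand-ins for the data outside this hunk):

```python
import numpy as np

def bb_mean(data, n_draws, rng):
    """Posterior draws of a group mean via Dirichlet reweighting."""
    data = np.asarray(data, dtype=float)
    weights = rng.dirichlet(np.ones(len(data)), size=n_draws)
    return weights @ data  # weighted mean for each draw

rng = np.random.default_rng(1)
presidents = np.array([183.0, 192, 182, 177, 185, 188, 188, 182, 193, 185])  # made-up
opponents  = np.array([178.0, 180, 183, 175, 173, 188, 185, 177, 182, 180])  # made-up

# Independent posteriors for each group mean; their elementwise difference
# is a sample from the posterior of the difference in means.
b_diff = bb_mean(presidents, 4000, rng) - bb_mean(opponents, 4000, rng)
print((b_diff > 0).mean())  # posterior probability presidents are taller
```

Subtracting draw-by-draw is valid because the two bootstraps are run on independent datasets, so the joint posterior factorises.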
@@ -82,12 +99,23 @@ plot(b_diff)

 ![](man/figures/README-height_comparison-1.png)

-So there is some evidence that loosing opponents could be shorter. (Though, I must add that it is quite unclear what the purpose really is with analyzing the heights of presidents and opponents...)
+So there is some evidence that loosing opponents could be shorter.
+(Though, I must add that it is quite unclear what the purpose really is
+with analyzing the heights of presidents and opponents…)

-A more advanced example
------------------------
+## A more advanced example

-A slightly more complicated example, where we do Bayesian bootstrap analysis of LOESS regression applied to the `cars` dataset on the speed of cars and the resulting distance it takes to stop. The `loess` function returns, among other things, a vector of `fitted` *y* values, one value for each *x* value in the data. These *y* values define the smoothed LOESS line and is what you would usually plot after having fitted a LOESS. Now we want to use the Bayesian bootstrap to gauge the uncertainty in the LOESS line. As the `loess` function accepts weighted data, we'll simply create a function that takes the data with weights and returns the `fitted` *y* values. We'll then plug that function into `bayesboot`:
+A slightly more complicated example, where we do Bayesian bootstrap
+analysis of LOESS regression applied to the `cars` dataset on the speed
+of cars and the resulting distance it takes to stop. The `loess`
+function returns, among other things, a vector of `fitted` *y* values,
+one value for each *x* value in the data. These *y* values define the
+smoothed LOESS line and is what you would usually plot after having
+fitted a LOESS. Now we want to use the Bayesian bootstrap to gauge the
+uncertainty in the LOESS line. As the `loess` function accepts weighted
+data, we’ll simply create a function that takes the data with weights
+and returns the `fitted` *y* values. We’ll then plug that function into
+`bayesboot`:

 ``` r
 boot_fn <- function(cars, weights) {
@@ -115,12 +143,19 @@ lines(cars$speed, colMeans(bb_loess, na.rm = TRUE), type ="l",

 ![](man/figures/README-car_plot-1.png)

-More information
-----------------
+## More information

-For more information on the Bayesian bootstrap see [Rubin's (1981) original paper](https://projecteuclid.org/euclid.aos/1176345338) and my blog post [The Non-parametric Bootstrap as a Bayesian Model](http://sumsar.net/blog/2015/04/the-non-parametric-bootstrap-as-a-bayesian-model/). The implementation of `bayesboot` is similar to the function outlined in the blog post [Easy Bayesian Bootstrap in R](http://sumsar.net/blog/2015/07/easy-bayesian-bootstrap-in-r/), but the interface is slightly different.
+For more information on the Bayesian bootstrap see [Rubin’s (1981)
+original paper](https://projecteuclid.org/euclid.aos/1176345338) and my
+blog post [The Non-parametric Bootstrap as a Bayesian
+Model](https://www.sumsar.net/blog/2015/04/the-non-parametric-bootstrap-as-a-bayesian-model/).
+The implementation of `bayesboot` is similar to the function outlined in
+the blog post [Easy Bayesian Bootstrap in
+R](https://www.sumsar.net/blog/2015/07/easy-bayesian-bootstrap-in-r/),
+but the interface is slightly different.

-References
-----------
+## References

-Rubin, D. B. (1981). The Bayesian bootstrap. *The annals of statistics*, 9(1), 130--134. [link to paper](https://projecteuclid.org/euclid.aos/1176345338)
+Rubin, D. B. (1981). The Bayesian bootstrap. *The annals of statistics*,
+9(1), 130–134. [link to
+paper](https://projecteuclid.org/euclid.aos/1176345338)
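The LOESS example in the README boils down to a general recipe: any fit routine that accepts per-observation weights can be Bayesian-bootstrapped by refitting under Dirichlet weight draws and collecting the fitted curves. A compact Python sketch of that idea (illustrative only: it swaps LOESS for a weighted polynomial fit via `np.polyfit`, and uses synthetic data in place of the `cars` dataset):

```python
import numpy as np

rng = np.random.default_rng(7)
speed = np.linspace(4, 25, 50)
dist = 3 * speed - 17 + rng.normal(0, 10, size=50)  # synthetic stand-in for `cars`

def fitted_curve(x, y, w):
    # np.polyfit multiplies residuals by its weights before squaring, so pass
    # sqrt(w) to weight the *squared* residuals by the Dirichlet draw w.
    coefs = np.polyfit(x, y, deg=2, w=np.sqrt(w))
    # Return fitted y at each x, mirroring the `fitted` values the README's
    # boot_fn extracts from a weighted loess() fit.
    return np.polyval(coefs, x)

# One refit per Dirichlet weight draw; each row is one plausible curve.
curves = np.array([fitted_curve(speed, dist, rng.dirichlet(np.ones(len(speed))))
                   for _ in range(400)])
lower, upper = np.percentile(curves, [2.5, 97.5], axis=0)  # pointwise 95% band
print(curves.shape, lower.shape)
```

Plotting `lower` and `upper` against `speed` gives the same kind of uncertainty band the README draws around the LOESS line with `colMeans(bb_loess)`.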

man/bayesboot.Rd

Lines changed: 3 additions & 3 deletions

Some generated files are not rendered by default.

man/figures/README-car_plot-1.png

−147 Bytes (binary change; two further binary figure changes of 95 Bytes and 63 Bytes)
