
Commit 92f29e1

[doc] Use https link for R document check. (dmlc#11257)
1 parent 0a3d318 commit 92f29e1

File tree

6 files changed: +33 −33 lines


R-package/R/xgb.train.R

Lines changed: 9 additions & 9 deletions
@@ -18,7 +18,7 @@
 #' See also the [migration guide](https://xgboost.readthedocs.io/en/latest/R-package/migration_guide.html)
 #' if coming from a previous version of XGBoost in the 1.x series.
 #' @param params List of XGBoost parameters which control the model building process.
-#' See the [online documentation](http://xgboost.readthedocs.io/en/latest/parameter.html)
+#' See the [online documentation](https://xgboost.readthedocs.io/en/latest/parameter.html)
 #' and the documentation for [xgb.params()] for details.
 #'
 #' Should be passed as list with named entries. Parameters that are not specified in this
@@ -458,8 +458,8 @@ xgb.train <- function(params = xgb.params(), data, nrounds, evals = list(),
 #' See [Survival Analysis with Accelerated Failure Time](https://xgboost.readthedocs.io/en/latest/tutorials/aft_survival_analysis.html) for details.
 #' - `"multi:softmax"`: set XGBoost to do multiclass classification using the softmax objective, you also need to set num_class(number of classes)
 #' - `"multi:softprob"`: same as softmax, but output a vector of `ndata * nclass`, which can be further reshaped to `ndata * nclass` matrix. The result contains predicted probability of each data point belonging to each class.
-#' - `"rank:ndcg"`: Use LambdaMART to perform pair-wise ranking where [Normalized Discounted Cumulative Gain (NDCG)](http://en.wikipedia.org/wiki/NDCG) is maximized. This objective supports position debiasing for click data.
-#' - `"rank:map"`: Use LambdaMART to perform pair-wise ranking where [Mean Average Precision (MAP)](http://en.wikipedia.org/wiki/Mean_average_precision#Mean_average_precision) is maximized
+#' - `"rank:ndcg"`: Use LambdaMART to perform pair-wise ranking where [Normalized Discounted Cumulative Gain (NDCG)](https://en.wikipedia.org/wiki/NDCG) is maximized. This objective supports position debiasing for click data.
+#' - `"rank:map"`: Use LambdaMART to perform pair-wise ranking where [Mean Average Precision (MAP)](https://en.wikipedia.org/wiki/Mean_average_precision#Mean_average_precision) is maximized
 #' - `"rank:pairwise"`: Use LambdaRank to perform pair-wise ranking using the `ranknet` objective.
 #' - `"reg:gamma"`: gamma regression with log-link. Output is a mean of gamma distribution. It might be useful, e.g., for modeling insurance claims severity, or for any outcome that might be [gamma-distributed](https://en.wikipedia.org/wiki/Gamma_distribution#Occurrence_and_applications).
 #' - `"reg:tweedie"`: Tweedie regression with log-link. It might be useful, e.g., for modeling total loss in insurance, or for any outcome that might be [Tweedie-distributed](https://en.wikipedia.org/wiki/Tweedie_distribution#Occurrence_and_applications).
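As the `"multi:softprob"` entry above notes, the objective returns a flat vector of length `ndata * nclass` that the caller reshapes into an `ndata * nclass` matrix. A minimal plain-Python sketch of that reshape (the probability values are made-up illustration data, not XGBoost output):

```python
# Hypothetical flat softprob output for ndata = 2 rows and nclass = 3 classes:
# length ndata * nclass, one row's class probabilities after another.
flat = [0.7, 0.2, 0.1,   # probabilities for row 1
        0.1, 0.3, 0.6]   # probabilities for row 2
ndata, nclass = 2, 3

# Reshape the flat vector into an ndata x nclass matrix.
matrix = [flat[i * nclass:(i + 1) * nclass] for i in range(ndata)]

# Each row now holds one data point's class probabilities and sums to 1.
assert len(matrix) == ndata
assert all(abs(sum(row) - 1.0) < 1e-9 for row in matrix)
print(matrix[0])  # probabilities for the first data point
```

The same reshape in R would be `matrix(flat, nrow = ndata, ncol = nclass, byrow = TRUE)`.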
@@ -543,7 +543,7 @@ xgb.train <- function(params = xgb.params(), data, nrounds, evals = list(),
 #'
 #' Note: should only pass one of `alpha` or `reg_alpha`. Both refer to the same parameter and there's thus no difference between one or the other.
 #' @param tree_method (for Tree Booster) (default= `"auto"`)
-#' The tree construction algorithm used in XGBoost. See description in the [reference paper](http://arxiv.org/abs/1603.02754) and [Tree Methods](https://xgboost.readthedocs.io/en/latest/treemethod.html).
+#' The tree construction algorithm used in XGBoost. See description in the [reference paper](https://arxiv.org/abs/1603.02754) and [Tree Methods](https://xgboost.readthedocs.io/en/latest/treemethod.html).
 #'
 #' Choices: `"auto"`, `"exact"`, `"approx"`, `"hist"`, this is a combination of commonly
 #' used updaters. For other updaters like `"refresh"`, set the parameter `updater`
@@ -613,16 +613,16 @@ xgb.train <- function(params = xgb.params(), data, nrounds, evals = list(),
 #' - Evaluation metrics for validation data, a default metric will be assigned according to objective (rmse for regression, and logloss for classification, `mean average precision` for ``rank:map``, etc.)
 #' - User can add multiple evaluation metrics.
 #' - The choices are listed below:
-#' - `"rmse"`: [root mean square error](http://en.wikipedia.org/wiki/Root_mean_square_error)
+#' - `"rmse"`: [root mean square error](https://en.wikipedia.org/wiki/Root_mean_square_error)
 #' - `"rmsle"`: root mean square log error: \eqn{\sqrt{\frac{1}{N}[log(pred + 1) - log(label + 1)]^2}}. Default metric of `"reg:squaredlogerror"` objective. This metric reduces errors generated by outliers in dataset. But because `log` function is employed, `"rmsle"` might output `nan` when prediction value is less than -1. See `"reg:squaredlogerror"` for other requirements.
 #' - `"mae"`: [mean absolute error](https://en.wikipedia.org/wiki/Mean_absolute_error)
 #' - `"mape"`: [mean absolute percentage error](https://en.wikipedia.org/wiki/Mean_absolute_percentage_error)
 #' - `"mphe"`: [mean Pseudo Huber error](https://en.wikipedia.org/wiki/Huber_loss). Default metric of `"reg:pseudohubererror"` objective.
-#' - `"logloss"`: [negative log-likelihood](http://en.wikipedia.org/wiki/Log-likelihood)
+#' - `"logloss"`: [negative log-likelihood](https://en.wikipedia.org/wiki/Log-likelihood)
 #' - `"error"`: Binary classification error rate. It is calculated as `#(wrong cases)/#(all cases)`. For the predictions, the evaluation will regard the instances with prediction value larger than 0.5 as positive instances, and the others as negative instances.
 #' - `"error@t"`: a different than 0.5 binary classification threshold value could be specified by providing a numerical value through 't'.
 #' - `"merror"`: Multiclass classification error rate. It is calculated as `#(wrong cases)/#(all cases)`.
-#' - `"mlogloss"`: [Multiclass logloss](http://scikit-learn.org/stable/modules/generated/sklearn.metrics.log_loss.html).
+#' - `"mlogloss"`: [Multiclass logloss](https://scikit-learn.org/stable/modules/generated/sklearn.metrics.log_loss.html).
 #' - `"auc"`: [Receiver Operating Characteristic Area under the Curve](https://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_the_curve).
 #' Available for classification and learning-to-rank tasks.
 #' - When used with binary classification, the objective should be `"binary:logistic"` or similar functions that work on probability.
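The `"logloss"`, `"error"`, and `"error@t"` definitions in the hunk above are simple enough to sketch directly. A plain-Python illustration of those formulas as the documentation states them (not XGBoost's internal implementation; the epsilon clipping is an added assumption to avoid `log(0)`):

```python
import math

def logloss(labels, preds, eps=1e-15):
    """Negative log-likelihood for binary labels, averaged over examples."""
    total = 0.0
    for y, p in zip(labels, preds):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(labels)

def error_at(labels, preds, t=0.5):
    """Binary classification error rate: #(wrong cases)/#(all cases).
    Predictions larger than threshold t count as positive; t defaults
    to 0.5 ("error"), and a custom t gives "error@t"."""
    wrong = sum(1 for y, p in zip(labels, preds) if (p > t) != (y == 1))
    return wrong / len(labels)

labels = [1, 0, 1, 0]
preds = [0.9, 0.2, 0.4, 0.6]
print(round(error_at(labels, preds), 2))          # 0.5 with the default threshold
print(round(error_at(labels, preds, t=0.35), 2))  # 0.25 at a lower threshold
```

Moving the threshold from 0.5 to 0.35 flips the 0.4 prediction to positive, which is why the two error rates differ.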
@@ -636,8 +636,8 @@ xgb.train <- function(params = xgb.params(), data, nrounds, evals = list(),
 #' After XGBoost 1.6, both of the requirements and restrictions for using `"aucpr"` in classification problem are similar to `"auc"`. For ranking task, only binary relevance label \eqn{y \in [0, 1]} is supported. Different from `"map"` (mean average precision), `"aucpr"` calculates the *interpolated* area under precision recall curve using continuous interpolation.
 #'
 #' - `"pre"`: Precision at \eqn{k}. Supports only learning to rank task.
-#' - `"ndcg"`: [Normalized Discounted Cumulative Gain](http://en.wikipedia.org/wiki/NDCG)
-#' - `"map"`: [Mean Average Precision](http://en.wikipedia.org/wiki/Mean_average_precision#Mean_average_precision)
+#' - `"ndcg"`: [Normalized Discounted Cumulative Gain](https://en.wikipedia.org/wiki/NDCG)
+#' - `"map"`: [Mean Average Precision](https://en.wikipedia.org/wiki/Mean_average_precision#Mean_average_precision)
 #'
 #' The `average precision` is defined as:
 #'
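For the `"ndcg"` metric touched in the last hunk, a plain-Python sketch of the standard textbook NDCG@k may help make the linked definition concrete. This uses the common exponential-gain/log-discount convention and is an illustration only; XGBoost's exact variant (gain convention, handling of empty or all-zero lists) may differ:

```python
import math

def dcg_at_k(rels, k):
    # Discounted cumulative gain over the top-k relevance labels,
    # with gain 2^rel - 1 and discount log2(position + 1).
    return sum((2 ** r - 1) / math.log2(i + 2) for i, r in enumerate(rels[:k]))

def ndcg_at_k(rels, k):
    """NDCG@k: DCG of the predicted order divided by the ideal (sorted) DCG."""
    ideal = dcg_at_k(sorted(rels, reverse=True), k)
    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0

# Relevance labels in the order the model ranked the documents.
ranked_rels = [3, 2, 3, 0, 1]
print(round(ndcg_at_k(ranked_rels, k=5), 4))
```

A perfectly sorted list scores exactly 1.0, and any misordering (here, the second-place document outranking a more relevant one) pulls the score below 1.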

R-package/man/xgb.cv.Rd

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.

R-package/man/xgb.params.Rd

Lines changed: 8 additions & 8 deletions

R-package/man/xgb.train.Rd

Lines changed: 1 addition & 1 deletion

R-package/man/xgboost.Rd

Lines changed: 6 additions & 6 deletions

0 commit comments