Commit 08d43b7

other_data
1 parent 8f0036f commit 08d43b7

File tree

6 files changed: +100 −45 lines


API_REFERENCE_FOR_REGRESSION.md

Lines changed: 8 additions & 5 deletions

@@ -1,6 +1,6 @@
 # APLRRegressor
 
-## class aplr.APLRRegressor(m:int=1000, v:float=0.1, random_state:int=0, loss_function:str="mse", link_function:str="identity", n_jobs:int=0, validation_ratio:float=0.2, bins:int=300, max_interaction_level:int=1, max_interactions:int=100000, min_observations_in_split:int=20, ineligible_boosting_steps_added:int=10, max_eligible_terms:int=5, verbosity:int=0, dispersion_parameter:float=1.5, validation_tuning_metric:str="default", quantile:float=0.5, calculate_custom_validation_error_function:Optional[Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], float]]=None, calculate_custom_loss_function:Optional[Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], float]]=None, calculate_custom_negative_gradient_function:Optional[Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], npt.ArrayLike]]=None, calculate_custom_transform_linear_predictor_to_predictions_function:Optional[Callable[[npt.ArrayLike], npt.ArrayLike]]=None, calculate_custom_differentiate_predictions_wrt_linear_predictor_function:Optional[Callable[[npt.ArrayLike], npt.ArrayLike]]=None, boosting_steps_before_pruning_is_done: int = 0)
+## class aplr.APLRRegressor(m:int=1000, v:float=0.1, random_state:int=0, loss_function:str="mse", link_function:str="identity", n_jobs:int=0, validation_ratio:float=0.2, bins:int=300, max_interaction_level:int=1, max_interactions:int=100000, min_observations_in_split:int=20, ineligible_boosting_steps_added:int=10, max_eligible_terms:int=5, verbosity:int=0, dispersion_parameter:float=1.5, validation_tuning_metric:str="default", quantile:float=0.5, calculate_custom_validation_error_function:Optional[Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], float]]=None, calculate_custom_loss_function:Optional[Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], float]]=None, calculate_custom_negative_gradient_function:Optional[Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], npt.ArrayLike]]=None, calculate_custom_transform_linear_predictor_to_predictions_function:Optional[Callable[[npt.ArrayLike], npt.ArrayLike]]=None, calculate_custom_differentiate_predictions_wrt_linear_predictor_function:Optional[Callable[[npt.ArrayLike], npt.ArrayLike]]=None, boosting_steps_before_pruning_is_done: int = 0)
 
 ### Constructor parameters
 
@@ -59,7 +59,7 @@ Specifies the quantile to use when ***loss_function*** is "quantile".
 A Python function that calculates validation error if ***validation_tuning_metric*** is "custom_function". Example:
 
 ```
-def custom_validation_error_function(y, predictions, sample_weight, group):
+def custom_validation_error_function(y, predictions, sample_weight, group, other_data):
     squared_errors = (y-predictions)**2
     return squared_errors.mean()
 ```
@@ -68,7 +68,7 @@ def custom_validation_error_function(y, predictions, sample_weight, group):
 A Python function that calculates loss if ***loss_function*** is "custom_function". Example:
 
 ```
-def custom_loss_function(y, predictions, sample_weight, group):
+def custom_loss_function(y, predictions, sample_weight, group, other_data):
     squared_errors = (y-predictions)**2
     return squared_errors.mean()
 ```
@@ -77,7 +77,7 @@ def custom_loss_function(y, predictions, sample_weight, group):
 A Python function that calculates the negative gradient if ***loss_function*** is "custom_function". The negative gradient should be proportional to the negative of the first order differentiation of the custom loss function (***calculate_custom_loss_function***) with respect to the predictions. Example:
 
 ```
-def custom_negative_gradient_function(y, predictions, group):
+def custom_negative_gradient_function(y, predictions, group, other_data):
     residuals = y-predictions
     return residuals
 ```
@@ -105,7 +105,7 @@ def calculate_custom_differentiate_predictions_wrt_linear_predictor(linear_predictor):
 #### boosting_steps_before_pruning_is_done (default = 0)
 Specifies how many boosting steps to wait before pruning the model. If 0 (default) then pruning is not done. If for example 500 then the model will be pruned in boosting steps 500, 1000, and so on. When pruning, terms are removed as long as this reduces the training error. This can be a computationally costly operation especially if the model gets many terms. Pruning may improve predictiveness.
 
-## Method: fit(X:npt.ArrayLike, y:npt.ArrayLike, sample_weight:npt.ArrayLike = np.empty(0), X_names:List[str]=[], validation_set_indexes:List[int]=[], prioritized_predictors_indexes:List[int]=[], monotonic_constraints:List[int]=[], group:npt.ArrayLike = np.empty(0), interaction_constraints:List[List[int]]=[])
+## Method: fit(X:npt.ArrayLike, y:npt.ArrayLike, sample_weight:npt.ArrayLike = np.empty(0), X_names:List[str]=[], validation_set_indexes:List[int]=[], prioritized_predictors_indexes:List[int]=[], monotonic_constraints:List[int]=[], group:npt.ArrayLike = np.empty(0), interaction_constraints:List[List[int]]=[], other_data: npt.ArrayLike = np.empty([0, 0]))
 
 ***This method fits the model to data.***
 
@@ -138,6 +138,9 @@ A numpy vector of integers that is used when ***loss_function*** is "group_mse".
 #### interaction_constraints
 An optional list containing lists of integers. Specifies interaction constraints on model terms. For example, interaction_constraints = [[0,1], [1,2,3]] means that 1) the first and second predictors may interact with each other, and that 2) the second, third and fourth predictors may interact with each other. There are no interaction constraints on predictors not mentioned in interaction_constraints.
 
+#### other_data
+An optional numpy matrix with other data. This is used in custom loss, negative gradient and validation error functions.
+
 
 ## Method: predict(X:npt.ArrayLike, cap_predictions_to_minmax_in_training:bool=True)
aplr/aplr.py

Lines changed: 22 additions & 3 deletions

@@ -26,16 +26,33 @@ def __init__(
         quantile: float = 0.5,
         calculate_custom_validation_error_function: Optional[
             Callable[
-                [npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], float
+                [
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                ],
+                float,
             ]
         ] = None,
         calculate_custom_loss_function: Optional[
             Callable[
-                [npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], float
+                [
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                    npt.ArrayLike,
+                ],
+                float,
             ]
         ] = None,
         calculate_custom_negative_gradient_function: Optional[
-            Callable[[npt.ArrayLike, npt.ArrayLike, npt.ArrayLike], npt.ArrayLike]
+            Callable[
+                [npt.ArrayLike, npt.ArrayLike, npt.ArrayLike, npt.ArrayLike],
+                npt.ArrayLike,
+            ]
         ] = None,
         calculate_custom_transform_linear_predictor_to_predictions_function: Optional[
             Callable[[npt.ArrayLike], npt.ArrayLike]
@@ -134,6 +151,7 @@ def fit(
         monotonic_constraints: List[int] = [],
         group: npt.ArrayLike = np.empty(0),
         interaction_constraints: List[List[int]] = [],
+        other_data: npt.ArrayLike = np.empty([0, 0]),
     ):
         self.__set_params_cpp()
         self.APLRRegressor.fit(
@@ -146,6 +164,7 @@ def fit(
             monotonic_constraints,
             group,
             interaction_constraints,
+            other_data,
         )
 
     def predict(
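The new `fit` parameter defaults to `np.empty([0, 0])`, an array with zero elements, which works as a "not supplied" sentinel because its `size` is 0. A minimal sketch of such a check (the helper name is invented; the wrapper itself simply forwards `other_data` to the C++ backend):

```python
import numpy as np

def other_data_was_supplied(other_data) -> bool:
    # np.empty([0, 0]) has shape (0, 0) and size 0, so a size check
    # distinguishes the default sentinel from real data.
    return np.asarray(other_data).size > 0

print(other_data_was_supplied(np.empty([0, 0])))          # False
print(other_data_was_supplied(np.array([[1.0], [2.0]])))  # True
```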
