@@ -17,22 +17,24 @@ Overview
1717 - ReHLine Minimization.
1818 * - :py:obj:`plqERM_Ridge <rehline.plqERM_Ridge>`
1919 - Empirical Risk Minimization (ERM) with a piecewise linear-quadratic (PLQ) objective and a ridge penalty.
20+ * - :py:obj:`CQR_Ridge <rehline.CQR_Ridge>`
21+ - Composite Quantile Regressor (CQR) with a ridge penalty.
2022
2123
2224.. list-table:: Function
2325 :header-rows: 0
2426 :widths: auto
2527 :class: summarytable
2628
27- * - :py:obj:`ReHLine_solver <rehline.ReHLine_solver>`\ (X, U, V, Tau, S, T, A, b, max_iter, tol, shrink, verbose, trace_freq)
29+ * - :py:obj:`ReHLine_solver <rehline.ReHLine_solver>`\ (X, U, V, Tau, S, T, A, b, Lambda, Gamma, xi, max_iter, tol, shrink, verbose, trace_freq)
2830 - \-
2931
3032
3133
3234Classes
3335-------
3436
35- .. py:class:: ReHLine(C=1.0, U=np.empty(shape=(0, 0)), V=np.empty(shape=(0, 0)), Tau=np.empty(shape=(0, 0)), S=np.empty(shape=(0, 0)), T=np.empty(shape=(0, 0)), A=np.empty(shape=(0, 0)), b=np.empty(shape=0), max_iter=1000, tol=0.0001, shrink=1, verbose=0, trace_freq=100)
37+ .. py:class:: ReHLine(C=1.0, U=np.empty(shape=(0, 0)), V=np.empty(shape=(0, 0)), Tau=np.empty(shape=(0, 0)), S=np.empty(shape=(0, 0)), T=np.empty(shape=(0, 0)), A=np.empty(shape=(0, 0)), b=np.empty(shape=0), max_iter=1000, tol=0.0001, shrink=1, warm_start=0, verbose=0, trace_freq=100)
3638
3739 Bases: :py:obj:`rehline._base._BaseReHLine`, :py:obj:`sklearn.base.BaseEstimator`
3840
@@ -66,13 +68,24 @@ Classes
6668 The intercept vector in the linear constraint.
6769
6870 verbose : int, default=0
69- Enable verbose output. Note that this setting takes advantage of a
70- per-process runtime setting in liblinear that, if enabled, may not work
71- properly in a multithreaded context.
71+ Enable verbose output.
7272
7373 max_iter : int, default=1000
7474 The maximum number of iterations to be run.
7575
76+ tol : float, default=1e-4
77+ The tolerance for the stopping criterion.
78+
79+ shrink : float, default=1
80+ The shrinkage of dual variables for the ReHLine algorithm.
81+
82+ warm_start : bool, default=False
83+ Whether to use the given dual parameters as an initial guess for the
84+ optimization algorithm (see the warm-start sketch after the attribute list below).
85+
86+ trace_freq : int, default=100
87+ The frequency at which to print the optimization trace.
88+
7689 Attributes
7790 ----------
7891 coef\_ : array-like
@@ -90,6 +103,15 @@ Classes
90103 primal_obj\_ : array-like
91104 The primal objective function values.
92105
106+ Lambda : array-like
107+ The optimized dual variables for ReLU parts.
108+
109+ Gamma : array-like
110+ The optimized dual variables for ReHU parts.
111+
112+ xi : array-like
113+ The optimized dual variables for linear constraints.
114+
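A minimal warm-start sketch (illustrative only: the synthetic data, the hinge-style `U`/`V`, and the `clf.fit(X)` call pattern are assumptions based on the parameters and attributes documented above, not a prescribed workflow):

.. code-block:: python

    import numpy as np
    from rehline import ReHLine

    # Synthetic binary labels and features (illustrative only)
    rng = np.random.RandomState(0)
    X = rng.randn(200, 5)
    y = np.sign(X @ rng.randn(5) + 0.1 * rng.randn(200))

    # ReLU parameters encoding a hinge-type loss: relu(-y_i * x_i^T beta + 1)
    U = -y.reshape(1, -1)
    V = np.ones((1, 200))

    clf = ReHLine(U=U, V=V, tol=1e-3, warm_start=True)
    clf.fit(X)

    # Tighten the tolerance and refit; with warm_start the stored dual
    # variables (Lambda, Gamma, xi) seed the second solve.
    clf.tol = 1e-6
    clf.fit(X)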
93115 Examples
94116 --------
95117
@@ -178,7 +200,7 @@ Classes
178200
179201
180202
181- .. py:class:: plqERM_Ridge(loss, constraint=[], C=1.0, U=np.empty(shape=(0, 0)), V=np.empty(shape=(0, 0)), Tau=np.empty(shape=(0, 0)), S=np.empty(shape=(0, 0)), T=np.empty(shape=(0, 0)), A=np.empty(shape=(0, 0)), b=np.empty(shape=0), max_iter=1000, tol=0.0001, shrink=1, verbose=0, trace_freq=100)
203+ .. py:class:: plqERM_Ridge(loss, constraint=[], C=1.0, U=np.empty(shape=(0, 0)), V=np.empty(shape=(0, 0)), Tau=np.empty(shape=(0, 0)), S=np.empty(shape=(0, 0)), T=np.empty(shape=(0, 0)), A=np.empty(shape=(0, 0)), b=np.empty(shape=0), max_iter=1000, tol=0.0001, shrink=1, warm_start=0, verbose=0, trace_freq=100)
182204
183205 Bases: :py:obj:`rehline._base._BaseReHLine`, :py:obj:`sklearn.base.BaseEstimator`
184206
@@ -328,9 +350,146 @@ Classes
328350
329351
330352
353+ .. py:class:: CQR_Ridge(quantiles, C=1.0, max_iter=1000, tol=0.0001, shrink=1, warm_start=0, verbose=0, trace_freq=100)
354+
355+ Bases: :py:obj:`rehline._base._BaseReHLine`, :py:obj:`sklearn.base.BaseEstimator`
356+
357+ Composite Quantile Regressor (CQR) with a ridge penalty.
358+
359+ It fits a linear regression model that minimizes a composite quantile loss function.
360+
361+ .. math::
362+
363+     \min_{\mathbf{\beta} \in \mathbb{R}^d, \, \mathbf{\beta}_0 \in \mathbb{R}^K} \sum_{k=1}^K \sum_{i=1}^n \text{PLQ}(y_i, \mathbf{x}_i^T \mathbf{\beta} + \beta_{0k}) + \frac{1}{2} \| \mathbf{\beta} \|_2^2.
364+
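Here :math:`\text{PLQ}` is the piecewise linear-quadratic loss attached to each quantile level; in composite quantile regression this is typically the check (pinball) loss, sketched below for level :math:`\tau_k` (the symbol :math:`\rho_{\tau_k}` is notation for this sketch, not part of the API):

.. math::

    \rho_{\tau_k}(y_i, z) = (y_i - z)\left(\tau_k - \mathbb{1}\{y_i - z < 0\}\right), \qquad z = \mathbf{x}_i^T \mathbf{\beta} + \beta_{0k}.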
365+
366+ Parameters
367+ ----------
368+ quantiles : list of float (n_quantiles,)
369+ The quantiles to be estimated.
370+
371+ C : float, default=1.0
372+ Regularization parameter. The strength of the regularization is
373+ inversely proportional to C. Must be strictly positive.
374+ `C` will be absorbed by the ReHLine parameters when `self.make_ReLHLoss` is conducted.
375+
376+ verbose : int, default=0
377+ Enable verbose output.
380+
381+ max_iter : int, default=1000
382+ The maximum number of iterations to be run.
383+
384+ tol : float, default=1e-4
385+ The tolerance for the stopping criterion.
386+
387+ shrink : float, default=1
388+ The shrinkage of dual variables for the ReHLine algorithm.
389+
390+ warm_start : bool, default=False
391+ Whether to use the given dual params as an initial guess for the
392+ optimization algorithm.
393+
394+ trace_freq : int, default=100
395+ The frequency at which to print the optimization trace.
396+
397+ Attributes
398+ ----------
399+ coef\_ : array-like
400+ The optimized model coefficients.
401+
402+ intercept\_ : array-like
403+ The optimized model intercepts.
404+
405+ quantiles\_ : array-like
406+ The quantiles to be estimated.
407+
408+ n_iter\_ : int
409+ The number of iterations performed by the ReHLine solver.
410+
411+ opt_result\_ : object
412+ The optimization result object.
413+
414+ dual_obj\_ : array-like
415+ The dual objective function values.
416+
417+ primal_obj\_ : array-like
418+ The primal objective function values.
419+
420+ Methods
421+ -------
422+ fit(X, y, sample_weight=None)
423+ Fit the model based on the given training data.
424+
425+ predict(X)
426+ The prediction for the given dataset.
427+
428+
429+ Overview
430+ ========
431+
432+
433+ .. list-table:: Methods
434+ :header-rows: 0
435+ :widths: auto
436+ :class: summarytable
437+
438+ * - :py:obj:`fit <rehline.CQR_Ridge.fit>`\ (X, y, sample_weight)
439+ - Fit the model based on the given training data.
440+ * - :py:obj:`predict <rehline.CQR_Ridge.predict>`\ (X)
441+ - The prediction for the given dataset.
442+
443+
444+ Members
445+ =======
446+
447+ .. py:method:: fit(X, y, sample_weight=None)
448+
449+ Fit the model based on the given training data.
450+
451+ Parameters
452+ ----------
453+
454+ X : array-like of shape (n_samples, n_features)
455+ Training vector, where `n_samples` is the number of samples and
456+ `n_features` is the number of features.
457+
458+ y : array-like of shape (n_samples,)
459+ The target variable.
460+
461+ sample_weight : array-like of shape (n_samples,), default=None
462+ Array of weights that are assigned to individual
463+ samples. If not provided, then each sample is given unit weight.
464+
465+ Returns
466+ -------
467+ self : object
468+ An instance of the estimator.
469+
470+
471+
472+
473+ .. py:method:: predict(X)
474+
475+ The prediction for the given dataset.
476+
477+ Parameters
478+ ----------
479+ X : array-like of shape (n_samples, n_features)
480+ The data matrix.
481+
482+ Returns
483+ -------
484+ ndarray of shape (n_samples, n_quantiles)
485+ The predicted values for each sample at each quantile level.
486+
487+
488+
489+
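A usage sketch for `CQR_Ridge` (illustrative only: the synthetic data and the chosen quantile levels are assumptions; the attributes and output shape follow the documentation above):

.. code-block:: python

    import numpy as np
    from rehline import CQR_Ridge

    # Synthetic heteroscedastic regression data (illustrative only)
    rng = np.random.RandomState(0)
    X = rng.randn(500, 3)
    y = X @ np.array([1.0, -2.0, 0.5]) + (1.0 + 0.5 * np.abs(X[:, 0])) * rng.randn(500)

    # Fit three quantile levels jointly; the slope vector is shared,
    # with one intercept per quantile level (see the objective above).
    reg = CQR_Ridge(quantiles=[0.1, 0.5, 0.9], C=1.0)
    reg.fit(X, y)

    print(reg.coef_)             # shared coefficients
    print(reg.intercept_)        # one intercept per quantile
    print(reg.predict(X).shape)  # (500, 3): one column per quantile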
331490Functions
332491---------
333- .. py:function:: ReHLine_solver(X, U, V, Tau=np.empty(shape=(0, 0)), S=np.empty(shape=(0, 0)), T=np.empty(shape=(0, 0)), A=np.empty(shape=(0, 0)), b=np.empty(shape=0), max_iter=1000, tol=0.0001, shrink=1, verbose=1, trace_freq=100)
492+ .. py:function:: ReHLine_solver(X, U, V, Tau=np.empty(shape=(0, 0)), S=np.empty(shape=(0, 0)), T=np.empty(shape=(0, 0)), A=np.empty(shape=(0, 0)), b=np.empty(shape=0), Lambda=np.empty(shape=(0, 0)), Gamma=np.empty(shape=(0, 0)), xi=np.empty(shape=(0, 0)), max_iter=1000, tol=0.0001, shrink=1, verbose=1, trace_freq=100)
334493
335494
336495
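A minimal call sketch for the low-level solver (illustrative only: the hinge-style `U`/`V` are synthetic, and the attributes of the returned result object are not documented in this section, so the sketch stops at the call itself):

.. code-block:: python

    import numpy as np
    from rehline import ReHLine_solver

    rng = np.random.RandomState(0)
    X = rng.randn(100, 4)
    y = np.sign(rng.randn(100))

    # ReLU parameters encoding a hinge-type loss: relu(-y_i * x_i^T beta + 1)
    U = -y.reshape(1, -1)
    V = np.ones((1, 100))

    res = ReHLine_solver(X, U, V)
    # A follow-up call may pass previously obtained dual variables back in,
    # e.g. ReHLine_solver(X, U, V, Lambda=..., Gamma=..., xi=...),
    # to warm-start the optimization via the newly added keyword arguments.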