doc/getting_started.rst (8 additions, 5 deletions)

@@ -1,5 +1,8 @@
 .. _getting_started:
 
+.. meta::
+    :description: Learn how to fit Lasso and custom GLM estimators with skglm, a modular Python library compatible with scikit-learn. Includes examples and code snippets.
+
 ===============
 Getting started
 ===============
@@ -31,7 +34,7 @@ Fitting a Lasso estimator
 -------------------------
 
 Let's start by generating a toy dataset and splitting it into train and test sets.
-*— A fast and modular scikit-learn replacement for regularized GLMs —*
+*— Fast and Flexible Generalized Linear Models for Python —*
 
 --------
 
 ``skglm`` is a Python package that offers **fast estimators** for regularized Generalized Linear Models (GLMs)
-that are **100% compatible with** ``scikit-learn``. It is **highly flexible** and supports a wide range of GLMs.
-You get to choose from ``skglm``'s already-made estimators or **customize your own** by combining the available datafits and penalties.
+that are **100% compatible with** ``scikit-learn``. It is **highly flexible** and supports a wide range of GLMs,
+designed to tackle high-dimensional data and scalable machine learning problems.
+Whether you choose from our ready-made estimators or **customize your own** using a modular combination of datafits and penalties,
+``skglm`` delivers performance and flexibility for both academic research and production environments.
 
 Get a hands-on glimpse on ``skglm`` through the :ref:`Getting started page <getting_started>`.
 
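The "datafit plus penalty" composition described above can be illustrated, outside skglm, with a small proximal-gradient (ISTA) loop in plain NumPy. This is a conceptual sketch with hypothetical class names, not skglm's actual solver or API:

```python
import numpy as np

class Quadratic:
    """Datafit: f(w) = ||y - Xw||^2 / (2 n)."""
    def gradient(self, X, y, w):
        return X.T @ (X @ w - y) / len(y)

class L1:
    """Penalty: alpha * ||w||_1, with its proximal operator."""
    def __init__(self, alpha):
        self.alpha = alpha
    def prox(self, w, step):
        # soft-thresholding, the prox of step * alpha * |.|_1
        return np.sign(w) * np.maximum(np.abs(w) - step * self.alpha, 0.0)

def ista(X, y, datafit, penalty, n_iter=1000):
    """Proximal gradient: alternate a gradient step on the datafit
    with the penalty's prox, using step size 1 / Lipschitz."""
    n, p = X.shape
    L = np.linalg.norm(X, ord=2) ** 2 / n
    w = np.zeros(p)
    for _ in range(n_iter):
        w = penalty.prox(w - datafit.gradient(X, y, w) / L, 1 / L)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
y = X[:, 0] - 2 * X[:, 1]          # noiseless, 2-sparse ground truth
w = ista(X, y, Quadratic(), L1(alpha=0.01))
print(w[:2])                        # close to [1, -2]
```

Swapping in a different datafit (e.g. logistic) or penalty (e.g. a group norm) only requires changing the objects passed in, which is the modularity the paragraph above advertises.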
@@ -79,6 +88,22 @@ It is also available on conda-forge and can be installed using, for instance:
 
     $ conda install -c conda-forge skglm
 
 With ``skglm`` installed, take your first steps with the package via the :ref:`Getting started section <getting_started>`.
+
+Applications and Use Cases
+--------------------------
+
+``skglm`` drives impactful solutions across diverse sectors with its fast, modular approach to regularized GLMs and sparse modeling. Some examples include:
+
+.. list-table::
+    :widths: 20 80
+
+    * - **Healthcare**
+      - Enhance clinical trial analytics and early biomarker discovery by efficiently analyzing high-dimensional biological data, with features such as Cox regression modeling.
+    * - **Finance**
+      - Conduct transparent and interpretable risk modeling with scalable, robust sparse regression across vast datasets.
+    * - **Energy**
+      - Optimize real-time electricity forecasting and load analysis by processing large time-series datasets for predictive maintenance and anomaly detection.
+
 Other advanced topics and use cases are covered in :ref:`Tutorials <tutorials>`.
 
 .. it is mandatory to keep the toctree here although it doesn't show up in the page
doc/tutorials/add_datafit.rst (2 additions)

@@ -1,6 +1,8 @@
 :orphan:
 
 .. _how_to_add_custom_datafit:
+.. meta::
+    :description: Tutorial on creating and implementing a custom datafit in skglm. Step-by-step guide includes deriving gradients, Hessians, and an example with the Poisson datafit.
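The description mentions deriving gradients and Hessians for a Poisson datafit. A minimal NumPy sketch of those quantities, with a finite-difference sanity check, is below; function names are illustrative, not skglm's actual datafit interface:

```python
import numpy as np

def poisson_value(X, y, w):
    """Poisson negative log-likelihood (up to constants): mean(exp(Xw) - y * Xw)."""
    Xw = X @ w
    return np.mean(np.exp(Xw) - y * Xw)

def poisson_gradient(X, y, w):
    """Gradient w.r.t. w: X^T (exp(Xw) - y) / n."""
    return X.T @ (np.exp(X @ w) - y) / len(y)

def poisson_raw_hessian_diag(X, y, w):
    """Second derivative w.r.t. each entry of Xw: exp(Xw) / n (diagonal Hessian)."""
    return np.exp(X @ w) / len(y)

# Check the analytic gradient against central finite differences.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = rng.poisson(1.0, size=20).astype(float)
w = 0.1 * rng.standard_normal(3)

eps = 1e-6
fd = np.array([
    (poisson_value(X, y, w + eps * e) - poisson_value(X, y, w - eps * e)) / (2 * eps)
    for e in np.eye(3)
])
print(np.allclose(fd, poisson_gradient(X, y, w), atol=1e-5))
```

A finite-difference check like this is a cheap way to validate any hand-derived gradient before plugging a custom datafit into a solver.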
doc/tutorials/add_penalty.rst (3 additions)

@@ -2,6 +2,9 @@
 .. _how_to_add_custom_penalty:
 
+.. meta::
+    :description: Step-by-step tutorial on adding custom penalties in skglm. Covers implementation details, proximal operators, and optimality conditions using the L1 penalty.
+
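For the L1 penalty named in the description, the proximal operator is coordinate-wise soft-thresholding. A small NumPy sketch with a brute-force sanity check (the helper name is illustrative):

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1, applied coordinate-wise:
    shrink each entry toward zero by tau, clipping at zero."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# Entries with magnitude below tau are zeroed; the rest are shrunk by tau.
print(soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0))

# Sanity check: for one coordinate, the prox should minimize
# 0.5 * (u - x)^2 + tau * |u| over u.
x, tau = 3.0, 1.0
grid = np.linspace(-5, 5, 10001)
best = grid[np.argmin(0.5 * (grid - x) ** 2 + tau * np.abs(grid))]
print(best)   # matches soft_threshold: 2.0
```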
doc/tutorials/alpha_max.rst (3 additions)

@@ -1,5 +1,8 @@
 .. _alpha_max:
 
+.. meta::
+    :description: Tutorial explaining the critical regularization strength (alpha_max) in skglm. Learn conditions for zero solutions in L1-regularized optimization problems.
+
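The condition the description alludes to can be checked numerically. For the Lasso objective ``||y - Xw||^2 / (2n) + alpha * ||w||_1``, the critical value is ``alpha_max = ||X^T y||_inf / n``; for ``alpha >= alpha_max`` the zero vector solves the problem. A NumPy sketch under those assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
y = rng.standard_normal(50)

n = len(y)
alpha_max = np.max(np.abs(X.T @ y)) / n

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# KKT at w = 0: zero is optimal iff every entry of |X^T y| / n is <= alpha,
# i.e. alpha >= alpha_max. Equivalently, one proximal-gradient step
# starting from w = 0 stays at 0:
L = np.linalg.norm(X, ord=2) ** 2 / n
step_from_zero = soft_threshold((X.T @ y) / n / L, alpha_max / L)
print(np.all(step_from_zero == 0.0))   # True: w = 0 is a fixed point
```

For any ``alpha`` slightly below ``alpha_max``, the coordinate attaining the max in ``|X^T y|`` would survive the thresholding, so at least one coefficient becomes nonzero.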
 the set of uncensored observations with the same time :math:`y_{i_l}`.
 
 Again, we refer to the expression of the negative log-likelihood according to the Efron estimate [`2`_, Section 6, equation (6.7)] to get the datafit formula

@@ -139,7 +142,7 @@ Again, we refer to the expression of the negative log-likelihood according to Ef
doc/tutorials/intercept.rst (3 additions)

@@ -1,5 +1,8 @@
 .. _maths_unpenalized_intercept:
 
+.. meta::
+    :description: In-depth guide on intercept handling in skglm solvers. Covers mathematical derivations, gradient updates, Lipschitz constants, and examples for quadratic, logistic, and Huber datafits.
+
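As one concrete instance of the gradient updates and Lipschitz constants the description mentions: for the quadratic datafit ``||y - Xw - b||^2 / (2n)``, the partial derivative with respect to the intercept ``b`` is the mean residual, and its Lipschitz constant is 1, so a single step of size 1 lands on the exact minimizer in ``b``. A NumPy sketch on illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 4))
w = rng.standard_normal(4)
y = X @ w + 3.0 + 0.1 * rng.standard_normal(30)   # true intercept 3.0

b = 0.0
# d/db of ||y - Xw - b||^2 / (2n) is mean(Xw + b - y); Lipschitz constant 1.
grad_b = np.mean(X @ w + b - y)
b = b - grad_b          # one step of size 1 / Lipschitz = 1

# For fixed w, this is exactly the minimizer b* = mean(y - Xw).
print(np.isclose(b, np.mean(y - X @ w)))   # True
```

For non-quadratic datafits (logistic, Huber) the intercept step is not exact, which is why solvers iterate it alongside the coefficient updates.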
doc/tutorials/prox_nn_group_lasso.rst (2 additions)

@@ -1,4 +1,6 @@
 .. _prox_nn_group_lasso:
+.. meta::
+    :description: Detailed tutorial on deriving the proximity operator and subdifferential for the positive group Lasso penalty in skglm. Includes mathematical proofs and examples.
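For the positive group Lasso penalty named in the description (``lam * ||u||_2`` plus a nonnegativity constraint on the group), a standard derivation gives the closed form ``prox(x) = (1 - lam / ||[x]_+||)_+ [x]_+``: clip to the positive part, then block soft-threshold. A NumPy sketch (function name illustrative, not skglm's API):

```python
import numpy as np

def prox_pos_group_lasso(x, lam):
    """Prox of  lam * ||u||_2 + indicator(u >= 0):
    block soft-thresholding applied to the positive part [x]_+."""
    x_pos = np.maximum(x, 0.0)
    norm = np.linalg.norm(x_pos)
    if norm <= lam:
        return np.zeros_like(x)       # whole group is zeroed out
    return (1.0 - lam / norm) * x_pos

x = np.array([3.0, -2.0, 4.0])
p = prox_pos_group_lasso(x, lam=1.0)
# Negative coordinate clipped to 0; remaining block shrunk by factor (1 - 1/5).
print(p)   # [2.4, 0.0, 3.2]
```

Note the group behavior: the whole block vanishes at once when the norm of its positive part falls below ``lam``, unlike the coordinate-wise L1 prox.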