
Commit f1ccea7

add new slide to elaborate frq-sev in insurance context (#14)
1 parent 84403af commit f1ccea7

File tree

3 files changed: +79 −16 lines changed


notebooks/000_Intro.ipynb

Lines changed: 42 additions & 8 deletions
@@ -46,7 +46,7 @@
     "We seek to create _principled_ models that provide explanatory inference and predictions of Marginal distributions $M$\n",
     "that are jointly coupled by a Latent Copula $C$, using quantified uncertainty to support real-world decision-making.\n",
     "\n",
-    "<img src='../plots/000_jointplot_corr.png' width='480px'/>"
+    "<img src='../plots/000_jointplot_corr.png' width='400px'/>"
    ]
   },
   {
@@ -59,12 +59,12 @@
    "source": [
     "**Motivation:**\n",
     "\n",
-    "+ A classic use-case for this model architecture (in the 2-dimensional setting) is insurance claims frequency and severity\n",
-    "+ The `frequency` of claims and the `severity` of each claim each have marginal distributions and a natural covariance\n",
-    "  $\\Sigma$ between marginals $M_{0}, M_{1}$\n",
-    "+ The joint product `frequency * severity = Loss Cost` i.e. the dollar value of insurable losses\n",
+    "+ A classic use-case for this model architecture (in the 2-dimensional setting) is insurance claims, a.k.a. incurred loss\n",
+    "+ We decompose the dollar value of claims into two marginal distributions, the `frequency` and `severity` of\n",
+    "  `expected loss cost`, because these measures are intuitive and can behave differently, with a (highly important)\n",
+    "  degree of covariance $\\Sigma$\n",
     "+ If we use a naive model that doesn't account for the covariance between `frequency` and `severity`, then the model\n",
-    "  predictions for `Loss Cost` can be hugely wrong!"
+    "  predictions for `expected loss cost` can be hugely wrong!"
   ]
  },
  {
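The motivation above rests on a standard identity: $\mathbb{E}[frq \cdot sev] = \mathbb{E}[frq]\,\mathbb{E}[sev] + \mathrm{Cov}(frq, sev)$, so a naive model that multiplies independently-estimated marginal means is biased whenever frequency and severity covary. A minimal sketch of that bias, using made-up lognormal marginals coupled through correlated Gaussian latents (all parameter values here are illustrative assumptions, not from the notebook):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Correlated standard-normal latents (a Gaussian-copula-style coupling),
# pushed through lognormal transforms to get frequency and severity.
rho = 0.6  # assumed positive frequency-severity correlation
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

frq = np.exp(-2.0 + 0.5 * z[:, 0])  # claim rate per unit exposure (illustrative)
sev = np.exp(8.0 + 1.0 * z[:, 1])   # severity per claim (illustrative)

loss_cost_true = (frq * sev).mean()        # joint expectation E[frq * sev]
loss_cost_naive = frq.mean() * sev.mean()  # product of marginal means

# With rho > 0, Cov(frq, sev) > 0, so the naive product understates loss cost.
print(f"true:  {loss_cost_true:,.0f}")
print(f"naive: {loss_cost_naive:,.0f}")
print(f"bias:  {1 - loss_cost_naive / loss_cost_true:+.1%}")
```

With these illustrative parameters the naive estimate comes out materially below the joint expectation, which is the "hugely wrong" failure mode the slide warns about.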
@@ -75,7 +75,41 @@
    }
   },
   "source": [
-   "<img src='../plots/000_jointplot_corr.png' width='360px'/>\n",
+   "### Quick Aside on decomposition of claims `frequency` and `severity`\n",
+   "\n",
+   "We can create different decompositions for different purposes, and according to the data available. A very useful one is\n",
+   "shown here: to use the ratio of losses per unit of TIV, and thus generalise to policies of different TIV.\n",
+   "\n",
+   "$$\n",
+   "\\begin{aligned}\n",
+   "frq_{i} &= \\frac{claim\\_ct_{i}}{TIV_{i}} \\\\\n",
+   "sev_{i} &= \\frac{incurred\\_total_{i}}{claim\\_ct_{i}} \\\\\n",
+   "\\\\\n",
+   "\\mathbb{E}_{\\text{loss} \\ i} &= frq_{i} * sev_{i} = \\frac{incurred\\_total_{i}}{TIV_{i}} \\\\\n",
+   "\\end{aligned}\n",
+   "$$\n",
+   "\n",
+   "where:\n",
+   "+ Each policy $i \\in n$ (the dataset of all policies) can have its own (policy-level) frequency ($frq_{i} \\geq 0$) and\n",
+   "  severity ($sev_{i} \\geq 0$) of claim (and thus policy-level $\\mathbb{E}_{\\text{loss} \\ i} \\geq 0$)\n",
+   "+ Note $frq$ and $sev$ tend to be zero-augmented distributions (where no loss is experienced): this is a very important\n",
+   "  aspect to include in more advanced model architectures\n",
+   "+ $claim\\_ct_{i} \\geq 0$ is the count of claims incurred for policy $i$\n",
+   "+ $TIV_{i} \\gt 0$ is the Total Insured Value (TIV) for policy $i$\n",
+   "+ $incurred\\_total_{i} \\geq 0$ is the total incurred losses for policy $i$\n"
+  ]
+ },
+ {
+  "cell_type": "markdown",
+  "metadata": {
+   "slideshow": {
+    "slide_type": "subslide"
+   }
+  },
+  "source": [
+   "#### Back to this presentation's focus on the copula function\n",
+   "\n",
+   "<img src='../plots/000_jointplot_corr.png' width='300px'/>\n",
    "\n",
    "\n",
    "**Demonstration:**\n",
@@ -89,7 +123,7 @@
     "  + We create a series of principled copula models using advanced architectures and Bayesian inference to fit to the\n",
     "    data and estimate the covariance on $M_{0}, M_{1}$\n",
     "  + The first model is naive and ignores the covariance, the final model is very sophisticated and estimates the covariance\n",
-    "  + We demonstrate **a substantial 32 percentage-point improvement in model accuracy** when using a copula-based model\n",
+    "  + We demonstrate **a substantial 33 percentage-point improvement in model accuracy** when using a copula-based model\n",
     "  + This correct estimation would likely make the difference between profitable pricing / accurate reserving, or greatly loss-making business over a portfolio."
   ]
  },
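The frequency/severity decomposition added in the new slide can be sketched numerically as follows. This is a minimal illustration with made-up policy-level arrays; the names `claim_ct`, `tiv`, and `incurred_total` mirror the slide's symbols but the values are hypothetical:

```python
import numpy as np

# Hypothetical policy-level data: claim counts, Total Insured Value (TIV),
# and total incurred losses for four policies (values invented for illustration).
claim_ct = np.array([0, 2, 1, 4])
tiv = np.array([100_000.0, 250_000.0, 50_000.0, 500_000.0])
incurred_total = np.array([0.0, 30_000.0, 5_000.0, 120_000.0])

# frq_i = claim_ct_i / TIV_i : claims per unit of insured value
frq = claim_ct / tiv

# sev_i = incurred_total_i / claim_ct_i, defined only where claims occurred;
# frq and sev are zero-augmented, so guard the division where claim_ct == 0.
sev = np.divide(incurred_total, claim_ct,
                out=np.zeros_like(incurred_total), where=claim_ct > 0)

# E_loss_i = frq_i * sev_i == incurred_total_i / TIV_i : loss per unit of TIV,
# which generalises across policies of different TIV.
e_loss = frq * sev
assert np.allclose(e_loss, incurred_total / tiv)
```

The closing assertion checks the identity from the slide's equation block: the two factors recombine to losses per unit of TIV, with the zero-claim policy contributing zero by construction.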

publish/000_Intro.pdf

20.8 KB (binary file not shown)

publish/index.html

Lines changed: 37 additions & 8 deletions
Large diffs are not rendered by default.
