
# SEVERITY, FREQUENCY & AGGREGATE LOSS

## Probability Functions
$F(x) = \Pr(X \le x)$
$S(x) = 1 - F(x) = \Pr(X > x)$
$f(x) = \dfrac{dF(x)}{dx} = -\dfrac{dS(x)}{dx}$
$h(x) = \dfrac{f(x)}{S(x)}$ (hazard rate)
$S(x) = e^{-\int_0^x h(t)\,dt}$

## Conditional Probability & Bayes' Theorem
Basic probability: $\Pr(A \mid B) = \dfrac{\Pr(A \cap B)}{\Pr(B)}$
Bayes' theorem: $\Pr(A \mid B) = \dfrac{\Pr(B \mid A)\Pr(A)}{\Pr(B)}$
Law of total probability: $\Pr(B) = \sum_i \Pr(B \mid A_i)\Pr(A_i)$

## Moments
Raw moment: $\mu'_k = E[X^k]$; central moment: $\mu_k = E[(X-\mu)^k]$

## Conditional Expectation Formula
$E[X] = E\big[E[X \mid Y]\big]$

## Variances
Calculate variance from the 2nd and 1st moments: $\operatorname{Var}[X] = E[X^2] - E[X]^2$
Variance of the sample mean: $\operatorname{Var}[\bar X] = \dfrac{\sigma^2}{n}$
Skewness: $\gamma_1 = \dfrac{\mu_3}{\sigma^3} = \dfrac{\mu'_3 - 3\mu\mu'_2 + 2\mu^3}{\sigma^3}$
Kurtosis: $\gamma_2 = \dfrac{\mu_4}{\sigma^4} = \dfrac{\mu'_4 - 4\mu\mu'_3 + 6\mu^2\mu'_2 - 3\mu^4}{\sigma^4}$

## Mean Excess Loss
$e(d) = E[X - d \mid X > d] = \dfrac{E[X] - E[X \wedge d]}{S(d)} = \dfrac{E[(X-d)_+]}{S(d)}$
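The central-moment formulas above can be sketched in code. A minimal check, using the exponential distribution (raw moments $\mu'_k = k!\,\theta^k$), which should give skewness 2 and kurtosis 9:

```python
# Skewness and kurtosis from the first four raw moments, checked against the
# exponential distribution (mu'_k = k! * theta^k); toy theta assumed.
from math import factorial

def skew_kurt(m1, m2, m3, m4):
    """Return (gamma1, gamma2) given raw moments m1..m4."""
    var = m2 - m1**2
    sd = var**0.5
    mu3 = m3 - 3*m1*m2 + 2*m1**3                 # third central moment
    mu4 = m4 - 4*m1*m3 + 6*m1**2*m2 - 3*m1**4    # fourth central moment
    return mu3 / sd**3, mu4 / var**2

theta = 2.0
m = [factorial(k) * theta**k for k in range(1, 5)]
g1, g2 = skew_kurt(*m)
print(g1, g2)  # exponential: skewness 2, kurtosis 9
```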

## Bernoulli Shortcut
If $X$ takes only two values, $a$ with probability $q$ and $b$ with probability $1-q$:
$\operatorname{Var}[X] = (a-b)^2\, q(1-q)$

## Percentiles
$\pi_p$ is a $100p$-th percentile if $F(\pi_p{-}) \le p \le F(\pi_p)$.

## Conditional Variance Formula
$\operatorname{Var}[X] = E\big[\operatorname{Var}(X \mid Y)\big] + \operatorname{Var}\big(E[X \mid Y]\big)$

## Mode
The mode maximizes $f(x)$: set $f'(x) = 0$.

## Deductibles
$(X-d)_+ = \begin{cases} 0, & X \le d \\ X-d, & X > d \end{cases}$
$(X \wedge d) = \begin{cases} X, & X < d \\ d, & X \ge d \end{cases}$
$E[X] = E[(X-d)_+] + E[X \wedge d]$
Payment per loss: $Y^L = (X-d)_+$, with $E[Y^L] = E[X] - E[X \wedge d]$
Payment per payment: $Y^P = Y^L \mid X > d$, with $E[Y^P] = \dfrac{E[Y^L]}{S(d)} = e(d)$

## Franchise Deductible
Pays the full loss $X$ when $X > d$:
$E[Y^L_{\text{franchise}}] = E[Y^L_{\text{ordinary}}] + d\,S(d)$

## Loss Elimination Ratio
$LER(d) = \dfrac{E[X \wedge d]}{E[X]}$

## Policy Limits
$E[X \wedge u] = \int_0^u S(x)\,dx$

## Bonus
A bonus of the form $B = c\,(rP - S)_+$ satisfies $B = c\,[rP - (S \wedge rP)]$, so $E[B] = c\,(rP - E[S \wedge rP])$.

## Aggregate Loss
$S = X_1 + \cdots + X_N$, with frequency $N$ independent of the severities $X_i$:
$E[S] = E[N]\,E[X]$
$\operatorname{Var}[S] = E[N]\operatorname{Var}[X] + \operatorname{Var}[N]\,E[X]^2$
For Poisson frequency: $\operatorname{Var}[S] = \lambda\,E[X^2]$

## (a, b, 0) Class: Finding a, b
$\dfrac{p_k}{p_{k-1}} = a + \dfrac{b}{k}, \quad k = 1, 2, \ldots$, where $p_k = \Pr(N = k)$.
To find $a$ and $b$, set $\dfrac{p_1}{p_0} = a + b$ and $\dfrac{p_2}{p_1} = a + \dfrac{b}{2}$, then solve.
Poisson: $p_k = \dfrac{e^{-\lambda}\lambda^k}{k!}$, with $a = 0$, $b = \lambda$; mode = greatest integer $\le \lambda$.

## Moment Generating Function
$M_X(t) = E[e^{tX}]$

## Probability Generating Function
$P_N(z) = E[z^N]$

## Frailty Model
$h(x \mid \Lambda) = \Lambda\, a(x)$, with $A(x) = \int_0^x a(t)\,dt$
$S(x \mid \Lambda) = e^{-\Lambda A(x)}$
$S(x) = E[S(x \mid \Lambda)] = E[e^{-\Lambda A(x)}] = M_\Lambda(-A(x))$

## Continuity Correction
Approximating a discrete $N$ by a normal $N^*$:
$\Pr[N = n] \approx \Pr[n - 0.5 \le N^* \le n + 0.5]$
$\Pr[N \le n] \approx \Pr[N^* \le n + 0.5]$
$\Pr[N < n] \approx \Pr[N^* \le n - 0.5]$
$\Pr[N \ge n] \approx \Pr[N^* \ge n - 0.5]$
$\Pr[N > n] \approx \Pr[N^* \ge n + 0.5]$
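The continuity correction can be sanity-checked numerically. A minimal sketch (toy values $\lambda = 10$, $n = 12$ assumed) comparing the exact Poisson probability $\Pr[N \le n]$ with the corrected normal approximation:

```python
# Continuity-corrected normal approximation to a Poisson tail probability:
# Pr[N <= n] ~ Pr[N* <= n + 0.5] with N* ~ Normal(lambda, lambda).
from math import exp, factorial, sqrt, erf

def norm_cdf(x, mu, sigma):
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

lam, n = 10.0, 12
exact = sum(exp(-lam) * lam**k / factorial(k) for k in range(n + 1))
approx = norm_cdf(n + 0.5, lam, sqrt(lam))   # continuity-corrected
print(exact, approx)
```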
## Severity: Coverage Modifications
Per-loss payment with deductible $d$, maximum covered loss $u$, coinsurance $\alpha$, and inflation rate $r$:
$E[Y^L] = \alpha(1+r)\left[E\!\left[X \wedge \tfrac{u}{1+r}\right] - E\!\left[X \wedge \tfrac{d}{1+r}\right]\right]$
Per payment: divide by $S\!\left(\tfrac{d}{1+r}\right)$.

## Frequency: Exposure & Coverage Modification
Exposure changes from $n_1$ to $n_2$: Poisson $\lambda \to \tfrac{n_2}{n_1}\lambda$; negative binomial $r \to \tfrac{n_2}{n_1}r$; binomial $m \to \tfrac{n_2}{n_1}m$.
Coverage modification with probability $v$ that a loss produces a payment: $\lambda \to v\lambda$; $\beta \to v\beta$ (negative binomial); $q \to vq$ (binomial).

# EMPIRICAL MODELS

## Review of Mathematical Statistics
$\operatorname{bias}(\hat\theta) = E[\hat\theta] - \theta$
$MSE(\hat\theta) = \operatorname{Var}[\hat\theta] + \operatorname{bias}(\hat\theta)^2$

## Empirical Distribution for Complete Data
$F_n(x) = \dfrac{\#\{x_j \le x\}}{n}$

## Variance of Empirical Estimators with Complete Data
$\widehat{\operatorname{Var}}[S_n(t)] = \dfrac{S_n(t)\,[1 - S_n(t)]}{n}$

## Kaplan-Meier (Product Limit) Estimator
$S_n(t) = \prod_{j=1}^{i-1}\left(1 - \dfrac{s_j}{r_j}\right), \qquad y_{i-1} \le t < y_i$

## Nelson-Aalen Estimator
$\hat H(t) = \sum_{j=1}^{i-1} \dfrac{s_j}{r_j}, \qquad y_{i-1} \le t < y_i; \qquad \hat S(t) = e^{-\hat H(t)}$

For incomplete individual data, the risk set is
$r_j = \#\{i : d_i < y_j\} - \#\{i : x_i < y_j\} - \#\{i : u_i < y_j\}$
(entrants minus deaths minus withdrawals before $y_j$).

## Variance of Kaplan-Meier and Nelson-Aalen Estimators
Greenwood: $\widehat{\operatorname{Var}}[S_n(t)] = S_n(t)^2 \sum_{j : y_j \le t} \dfrac{s_j}{r_j(r_j - s_j)}$
Nelson-Aalen: $\widehat{\operatorname{Var}}[\hat H(t)] = \sum_{j : y_j \le t} \dfrac{s_j}{r_j^2}$

## Confidence Intervals for S(t)
Linear: $S_n(t) \pm z_{(1+p)/2}\sqrt{\widehat{\operatorname{Var}}[S_n(t)]}$
Log-transformed CI of $S(t)$: $\left[S_n(t)^{1/U},\; S_n(t)^{U}\right]$, where $U = \exp\!\left(\dfrac{z_{(1+p)/2}\sqrt{\widehat{\operatorname{Var}}[S_n(t)]}}{S_n(t)\,\ln S_n(t)}\right)$

## Kernel Smoothing: Density and Distribution Functions
$\hat f(x) = \sum_j p_n(y_j)\,k_{y_j}(x), \qquad \hat F(x) = \sum_j p_n(y_j)\,K_{y_j}(x)$

## Risk Measures
$\operatorname{VaR}_p(X) = \pi_p = F^{-1}(p)$
$\operatorname{TVaR}_p(X) = E[X \mid X > \operatorname{VaR}_p(X)] = \operatorname{VaR}_p(X) + e(\operatorname{VaR}_p(X))$
$\operatorname{TVaR}_p$: coherent (all 4 properties: translation invariance, positive homogeneity, subadditivity, monotonicity).
$\operatorname{VaR}_p$: fails subadditivity; the standard deviation principle $E[X] + k\,\sigma[X]$ fails monotonicity.
$\operatorname{TVaR}_p$ for Pareto (known $\alpha$, $\theta$): $\operatorname{TVaR}_p = \operatorname{VaR}_p + \dfrac{\operatorname{VaR}_p + \theta}{\alpha - 1}$
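The Kaplan-Meier and Nelson-Aalen estimators above can be sketched in a few lines. A minimal implementation on toy right-censored data (the times and censoring flags are assumed for illustration):

```python
# Kaplan-Meier S_n(t) and Nelson-Aalen H(t) evaluated at the largest event
# time, for right-censored individual data (no truncation).
def km_na(times, censored):
    """times: observed times; censored[i] is True if observation i is censored."""
    event_times = sorted({t for t, c in zip(times, censored) if not c})
    S, H = 1.0, 0.0
    for y in event_times:
        r = sum(1 for t in times if t >= y)                               # risk set r_j
        s = sum(1 for t, c in zip(times, censored) if t == y and not c)   # deaths s_j
        S *= 1 - s / r
        H += s / r
    return S, H

times    = [1, 2, 2, 3, 4, 5]
censored = [False, False, True, False, True, False]
S, H = km_na(times, censored)
print(S, H)
```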
## Tail Weight Measures
A heavier tail is indicated by: fewer positive moments; $\lim_{x\to\infty} S_1(x)/S_2(x) = \infty$; increasing mean excess loss $e(x)$; decreasing hazard rate $h(x)$.

## Kernels and Moments of Kernel-Smoothed Distributions
Uniform kernel (bandwidth $b$): $k_y(x) = \dfrac{1}{2b}$ for $y - b \le x \le y + b$.
Triangular kernel: rises linearly from $0$ at $y-b$ to a peak of $1/b$ at $y$, then falls to $0$ at $y+b$.
The kernel-smoothed mean equals the empirical mean; the uniform kernel adds $\dfrac{b^2}{3}$ to the empirical variance, the triangular kernel adds $\dfrac{b^2}{6}$.

## Censoring and Truncation
Left truncated at $d$: there is no information on data below $d$ (deductible).
Left censored at $d$: we only know that the data point is below $d$.
Right truncated at $u$: there is no information on data above $u$.
Right censored at $u$: we only know that the data point is above $u$ (limit).

Likelihood contributions:
- Left truncated & right censored: not right censored, $f(x)/S(d)$; right censored, $S(u)/S(d)$.
- Left truncated & right truncated: $\dfrac{f(x)}{S(d) - S(u)} = \dfrac{f(x)}{F(u) - F(d)}$.
- Left censored & right truncated: not left censored, $f(x)/F(u)$; left censored, $F(d)/F(u)$.
- Left censored & right censored: neither, $f(x)$; left censored, $F(d)$; right censored, $S(u)$.

## Delta Method
One variable: $\operatorname{Var}[g(\hat\theta)] \approx g'(\theta)^2\,\operatorname{Var}[\hat\theta]$
Two variables: $\operatorname{Var}[g(\hat\alpha, \hat\beta)] \approx \left(\dfrac{\partial g}{\partial \alpha}\right)^2 \operatorname{Var}[\hat\alpha] + \left(\dfrac{\partial g}{\partial \beta}\right)^2 \operatorname{Var}[\hat\beta] + 2\,\dfrac{\partial g}{\partial \alpha}\,\dfrac{\partial g}{\partial \beta}\,\operatorname{Cov}[\hat\alpha, \hat\beta]$

## Mortality Table Construction
Estimate $q_j$ for each age interval from the exposures and deaths observed in that interval (individual or grouped data).
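The one-variable delta method can be checked against simulation. A sketch with $g(\theta) = e^\theta$ and assumed toy values for $\theta$ and $\operatorname{Var}[\hat\theta]$:

```python
# Delta method: Var[g(theta_hat)] ~ g'(theta)^2 * Var[theta_hat], compared
# with a direct Monte Carlo estimate for g(theta) = exp(theta).
import random, math

theta, var_theta = 1.0, 0.01
delta_var = math.exp(theta) ** 2 * var_theta   # g'(theta)^2 * Var

random.seed(0)
draws = [math.exp(random.gauss(theta, math.sqrt(var_theta))) for _ in range(200000)]
mean = sum(draws) / len(draws)
mc_var = sum((d - mean) ** 2 for d in draws) / (len(draws) - 1)
print(delta_var, mc_var)
```

The two agree closely here because the variance is small, which is exactly the regime where the first-order approximation is valid.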
# PARAMETRIC MODELS

## Method of Moments
To fit a $k$-parameter distribution, set:
$E[X^j] = \dfrac{1}{n}\sum_{i=1}^n x_i^j \quad \text{for } j = 1, 2, \ldots, k$

## Percentile Matching
Smoothed empirical percentile: $\hat\pi_p$ is the $[(n+1)p]$-th order statistic.
If $(n+1)p$ is not an integer, interpolate between the order statistics before and after the $[(n+1)p]$-th observation.

Percentile matching with incomplete data:
- With censored data, select percentiles within the range of the uncensored observations.
- With truncated data, match the percentiles of the conditional distribution.

## Maximum Likelihood
Steps to calculate the MLE:
1. $L(\theta) = \prod_i f(x_i; \theta)$
2. $l(\theta) = \ln L(\theta)$
3. Set $l'(\theta) = 0$
4. Solve for $\hat\theta$

Likelihood contributions:
- Individual data: $f(x)$
- Left truncated at $d$: $f(x)/S(d)$
- Right censored at $u$: $S(u)$
- Grouped data in $(c_{j-1}, c_j]$: $F(c_j) - F(c_{j-1})$
- Grouped data between $d$ and $c_j$, left truncated at $d$: $[F(c_j) - F(d)]/S(d)$

MLE = MOM for: Poisson; binomial ($m$ known); negative binomial ($r$ known); gamma ($\alpha$ known); normal.

MLE shortcuts:
- Exponential: $\hat\theta = \dfrac{\text{sum of all observed amounts}}{\text{number of uncensored observations}}$
- If the likelihood has the form $L(\theta) = \theta^{-b}e^{-a/\theta}$, then $\hat\theta = a/b$.

## Variance of Maximum Likelihood Estimators (Asymptotic Variance)
Fisher's information (information associated with the MLE):
One variable: $I(\theta) = -E[l''(\theta)] = E[l'(\theta)^2]$, and $\operatorname{Var}[\hat\theta] \approx I(\theta)^{-1}$.
Two variables: covariance matrix of $\hat\theta = I(\theta_1, \theta_2)^{-1}$.

## Fitting Discrete Distributions
Two methods to fit data to an $(a, b, 0)$ class distribution:
Method 1: Compare $\hat\sigma^2$ to $\bar x$.
Method 2: Calculate $k\,\dfrac{\hat p_k}{\hat p_{k-1}}$ for the first few values of $k$ and observe the slope of the line created from these values (slope $a$, intercept $b$).

| Distribution | Method 1 | Method 2 |
|---|---|---|
| Poisson | $\hat\sigma^2 = \bar x$ | $a = 0$, $b = \lambda$ |
| Binomial | $\hat\sigma^2 < \bar x$ | $a < 0$ |
| Neg. binomial | $\hat\sigma^2 > \bar x$ | $a > 0$, $b \ne 0$ |
| Geometric | $\hat\sigma^2 > \bar x$ | $a = \beta/(1+\beta)$, $b = 0$ |

## Truncated & Zero-Modified Distributions
$p_k^T = \dfrac{p_k}{1 - p_0}, \qquad p_k^M = \dfrac{1 - p_0^M}{1 - p_0}\,p_k = (1 - p_0^M)\,p_k^T$

## Sum Distributions
Sum of $n$ independent negative binomials (same $\beta$) $\sim \text{NB}\!\left(\sum r_i,\; \beta\right)$
Sum of $n$ independent binomials (same $q$) $\sim \text{Bin}\!\left(\sum m_i,\; q\right)$
Sum of $n$ independent Poissons $\sim \text{Poisson}\!\left(\sum \lambda_i\right)$
Sum of $n$ independent normals $\sim N\!\left(\sum \mu_i,\; \sum \sigma_i^2\right)$
Sum of $n$ iid exponentials($\theta$) $\sim \Gamma(n, \theta)$
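The exponential MLE shortcut with right-censored data can be verified directly. A sketch on toy data (the observations and censoring flags are assumed), checking the shortcut against a brute-force grid search of the log-likelihood:

```python
# Exponential MLE with right censoring: theta_hat = (sum of all observed
# amounts) / (number of uncensored observations), checked by grid search.
import math

data     = [3.0, 1.5, 4.0, 2.5, 5.0]
censored = [False, False, True, False, True]   # True -> we only know X > value

theta_hat = sum(data) / sum(1 for c in censored if not c)   # the shortcut

def loglik(theta):
    ll = 0.0
    for x, c in zip(data, censored):
        # censored point contributes ln S(x) = -x/theta; uncensored ln f(x)
        ll += -x / theta if c else (-math.log(theta) - x / theta)
    return ll

grid_best = max((loglik(t), t) for t in [i / 100 for i in range(100, 2000)])[1]
print(theta_hat, grid_best)
```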
## Hypothesis Tests
Reject when: test statistic > critical value.

D(x) plots: $D(x) = F_n(x) - F^*(x)$
p-p plots: plot the empirical distribution, $F_n(x_j) = \dfrac{j}{n+1}$, on the x-axis against the fitted distribution $F^*(x_j)$ on the y-axis.

## Kolmogorov-Smirnov Test
Test statistic: $D = \max_j\big[\max\big(|F_n(x_j{-}) - F^*(x_j)|,\; |F_n(x_j) - F^*(x_j)|\big)\big]$
- Only for individual data.
- Lower critical value if $u < \infty$ (right-censored data).
- If parameters are fitted, the critical value should be lowered.
- A larger sample size has a lower critical value.
- Uniform weight on all parts of the distribution.

## Chi-square Test
Test statistic: $Q = \sum_{j=1}^k \dfrac{(O_j - E_j)^2}{E_j} = \sum_{j=1}^k \dfrac{O_j^2}{E_j} - n$
- If $r$ parameters are estimated using the same data used to calculate the test statistic, then there are $k - 1 - r$ degrees of freedom.
- May be used for individual data or grouped data.
- No adjustment of the critical value if $u < \infty$.
- If parameters are fitted, the critical value is automatically adjusted.
- The critical value is independent of sample size.
- Higher weight on intervals with low fitted probability.

## Likelihood Ratio Test
Test statistic $= 2(l_1 - l_0)$
Degrees of freedom = (# free parameters in alternate) $-$ (# free parameters in null).
Critical value from the chi-square distribution.

## Schwarz Bayesian Criterion and Akaike Information Criterion
$SBC = l - \dfrac{r}{2}\ln n$
$AIC = l - r$
$n$ = number of data points; $r$ = number of parameters in the model.
Choose the model with the highest penalized value.

# CREDIBILITY

## Limited Fluctuation Credibility
Full credibility basic premise: $\Pr\big(|\bar R - E[\bar R]| \le k\,E[\bar R]\big) \ge p$
where $R$ = frequency, severity, or aggregate loss; $N$ = frequency; $X$ = severity.
$n_0 = \left(\dfrac{z_{(1+p)/2}}{k}\right)^2$

Number of claims needed for full credibility of:
- number of claims: $n_0\,\dfrac{\sigma_N^2}{\mu_N}$; if frequency is Poisson, this is $n_0$.
- claim size: $n_0\,\dfrac{\sigma_X^2}{\mu_X^2} = n_0\,CV_X^2$
- aggregate losses: $n_0\left(\dfrac{\sigma_N^2}{\mu_N} + \dfrac{\sigma_X^2}{\mu_X^2}\right)$; if frequency is Poisson, $n_0\,(1 + CV_X^2)$.

Number of exposures (expected claims) needed for full credibility: $e_F = \dfrac{n_F}{\mu_N}$

## Partial Credibility
$P_C = Z\bar X + (1 - Z)M = M + Z(\bar X - M)$
$Z = \begin{cases}\sqrt{n/n_F}, & n < n_F \\ 1, & \text{otherwise}\end{cases}$ with $n$ = number of claims observed,
or $Z = \begin{cases}\sqrt{e/e_F}, & e < e_F \\ 1, & \text{otherwise}\end{cases}$ with $e$ = number of exposures observed.

## Bayesian Estimation and Credibility
Posterior: revised probability for the parameter, given the data.
Predictive: probability of the next claim, given the data.
Prior density: $\pi(\theta)$
Conditional density: $f(x \mid \theta)$
Unconditional density: $f(x_1, \ldots, x_n) = \int \pi(\theta)\,f(x_1, \ldots, x_n \mid \theta)\,d\theta$
Posterior density: $\pi(\theta \mid x_1, \ldots, x_n) = \dfrac{\pi(\theta)\,f(x_1, \ldots, x_n \mid \theta)}{f(x_1, \ldots, x_n)}$
Predictive density: $f(x_{n+1} \mid x_1, \ldots, x_n) = \int f(x_{n+1} \mid \theta)\,\pi(\theta \mid x_1, \ldots, x_n)\,d\theta$
Bayesian credibility estimate of $X_{n+1}$ = mean of the predictive distribution = $E[X_{n+1} \mid x_1, \ldots, x_n]$.

## Loss Functions
Squared-error loss $\to$ posterior mean; absolute loss $\to$ posterior median; zero-one loss $\to$ posterior mode.

## Conjugate Priors
Predictive mean = posterior mean; for conjugate priors, the predictive mean = the Buhlmann estimate.
- Poisson/gamma: prior $\Gamma(\alpha, \theta)$; posterior $\Gamma\!\left(\alpha + \sum x_i,\; \dfrac{\theta}{1 + n\theta}\right)$; predictive is negative binomial.
- Normal/normal: prior $N(\mu, a^2)$, model $N(\theta, \sigma^2)$; posterior normal with mean $\dfrac{\sigma^2\mu + a^2\sum x_i}{\sigma^2 + na^2}$ and variance $\dfrac{a^2\sigma^2}{\sigma^2 + na^2}$.
- Bernoulli/beta: prior $\text{Beta}(a, b)$; with $k$ = # of successes, posterior $\text{Beta}(a + k,\; b + n - k)$.
- Exponential/inverse gamma: prior $\text{InvGamma}(\alpha, \theta)$; posterior $\text{InvGamma}\!\left(\alpha + n,\; \theta + \sum x_i\right)$.
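The limited-fluctuation standard above is easy to compute. A sketch with the common assumed choices $p = 0.90$, $k = 0.05$ (and an illustrative severity $CV^2$), giving the classic $n_0 \approx 1082$:

```python
# Full-credibility claim standard: n0 = (z_{(1+p)/2} / k)^2; for aggregate
# losses with Poisson frequency, claims needed = n0 * (1 + CV^2).
from statistics import NormalDist

p, k = 0.90, 0.05
z = NormalDist().inv_cdf((1 + p) / 2)    # z_{(1+p)/2}
n0 = (z / k) ** 2
cv2 = 0.5                                 # severity CV^2, assumed for illustration
claims_needed = n0 * (1 + cv2)
print(round(n0), round(claims_needed))
```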
## Buhlmann Credibility
Estimating:
$\mu = E[\mu(\theta)]$, where $\mu(\theta) = E[X \mid \theta]$ (expected hypothetical mean)
$v = E[v(\theta)]$, where $v(\theta) = \operatorname{Var}[X \mid \theta]$ (expected process variance)
$a = \operatorname{Var}[\mu(\theta)]$ (variance of hypothetical means)
$k = \dfrac{v}{a}, \qquad Z = \dfrac{n}{n + k}, \qquad 1 - Z = \dfrac{k}{n + k}$
$P_C = Z\bar x + (1 - Z)\mu = \mu + Z(\bar x - \mu)$
Note: $n$ is the denominator of the term used to calculate $\bar x$ (for Buhlmann-Straub, total exposures).

## Empirical Bayes Non-Parametric Methods
$r$ = number of groups; $n_i$ = number of years for group $i$; $m_{ij}$ = number of policyholders in group $i$, year $j$.
1. Calculate $\bar x_i = \dfrac{\sum_j m_{ij}x_{ij}}{m_i}$ and $\bar x = \dfrac{\sum_i m_i\bar x_i}{m}$.
2. $\hat v = \dfrac{\sum_i\sum_j m_{ij}(x_{ij} - \bar x_i)^2}{\sum_i (n_i - 1)}$
3. $\hat a = \dfrac{\sum_i m_i(\bar x_i - \bar x)^2 - \hat v(r - 1)}{m - \dfrac{1}{m}\sum_i m_i^2}$
4. $\hat Z_i = \dfrac{m_i}{m_i + \hat k}$, with $\hat k = \hat v/\hat a$.

Uniform exposures ($m_{ij} = 1$, $n$ years per group):
$\hat v = \dfrac{1}{r(n-1)}\sum_i\sum_j (x_{ij} - \bar x_i)^2, \qquad \hat a = \dfrac{1}{r-1}\sum_i (\bar x_i - \bar x)^2 - \dfrac{\hat v}{n}$

## Empirical Bayes Semi-Parametric Methods
If the model is Poisson: $\hat\mu = \hat v = \bar x$ and $\hat a = \hat\sigma^2 - \bar x$ (sample variance minus sample mean).

# SIMULATION

## Inverse Method
$x = F^{-1}(u)$, with $u$ uniform on $(0, 1)$.

## Special Techniques

### Multiple Decrements
Simulate the time of each decrement and apply the one that occurs first.

### Bootstrap Approximation
Resample $n$ points with replacement from the empirical distribution; the bootstrap approximates the MSE of an estimator against the empirical distribution. The number of times any one original point appears in a resample is $\sim \text{Binomial}\!\left(m = n,\; q = \dfrac{1}{n}\right)$.

### Simulating (a, b, 0) Class Distributions (Stochastic Simulation)
Simulating a Poisson variable: the rate at which the next event occurs is $\lambda$, so the mean amount of time until the next event is $1/\lambda$. The time until the next event is exponential with mean $1/\lambda$; sum exponential inter-event times until they exceed 1 to obtain the count.

### Normal Random Variables: The Polar Method
Generate $U_1, U_2$ uniform on $(-1, 1)$ until $S = U_1^2 + U_2^2 < 1$; then
$Z_1 = U_1\sqrt{-2\ln S/S}, \qquad Z_2 = U_2\sqrt{-2\ln S/S}$
are independent standard normals.

### Number of Data Values to Generate
Choose $n$ so that the half-width of the confidence interval, $z_{(1+p)/2}\,s/\sqrt{n}$, is within the desired tolerance.

# DISTRIBUTION

## Pareto(α, θ)
$f(x) = \dfrac{\alpha\theta^\alpha}{(x+\theta)^{\alpha+1}}, \qquad F(x) = 1 - \left(\dfrac{\theta}{x+\theta}\right)^\alpha$
$E[X] = \dfrac{\theta}{\alpha-1} \;(\alpha > 1), \qquad E[X^2] = \dfrac{2\theta^2}{(\alpha-1)(\alpha-2)}, \qquad E[X^k] = \dfrac{k!\,\theta^k}{(\alpha-1)\cdots(\alpha-k)}$
$\operatorname{Var}[X] = \dfrac{\alpha\theta^2}{(\alpha-1)^2(\alpha-2)}$
$E[(X-d)_+] = \dfrac{\theta^\alpha}{(\alpha-1)(\theta+d)^{\alpha-1}}, \qquad e(d) = \dfrac{d+\theta}{\alpha-1}$
$E[X \wedge d] = \dfrac{\theta}{\alpha-1}\left[1 - \left(\dfrac{\theta}{d+\theta}\right)^{\alpha-1}\right] \;(\alpha \ne 1)$
$\operatorname{VaR}_p(X) = \theta\left[(1-p)^{-1/\alpha} - 1\right]$
Mode $= 0$
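The inverse method applied to the Pareto is a one-liner, since $\operatorname{VaR}_p$ above is exactly $F^{-1}(p)$. A sketch (toy $\alpha$, $\theta$ assumed) checking the simulated mean against $\theta/(\alpha-1)$:

```python
# Inverse-method simulation of a Pareto(alpha, theta):
# x = F^{-1}(u) = theta * ((1-u)^{-1/alpha} - 1).
import random

def pareto_inv(u, alpha, theta):
    return theta * ((1 - u) ** (-1 / alpha) - 1)

random.seed(1)
alpha, theta = 3.0, 10.0
xs = [pareto_inv(random.random(), alpha, theta) for _ in range(200000)]
mean = sum(xs) / len(xs)
print(mean)   # should be near theta/(alpha-1) = 5
```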
## Exponential(θ)
$f(x) = \dfrac{e^{-x/\theta}}{\theta}, \qquad F(x) = 1 - e^{-x/\theta}$
$E[X^k] = k!\,\theta^k, \qquad \operatorname{Var}[X] = \theta^2, \qquad E[X \wedge d] = \theta\left(1 - e^{-d/\theta}\right)$
Memoryless: $(X - d) \mid X > d \sim \text{Exponential}(\theta)$, so $e(d) = \theta$.
Inverse method: $x = -\theta\ln(1 - u)$. Mode $= 0$.

## Gamma(α, θ)
$f(x) = \dfrac{x^{\alpha-1}e^{-x/\theta}}{\Gamma(\alpha)\,\theta^\alpha}$
$E[X] = \alpha\theta, \qquad \operatorname{Var}[X] = \alpha\theta^2, \qquad E[X^k] = \theta^k\,\dfrac{\Gamma(\alpha+k)}{\Gamma(\alpha)} = \theta^k\,\alpha(\alpha+1)\cdots(\alpha+k-1)$
$M(t) = (1 - \theta t)^{-\alpha}$; Mode $= (\alpha - 1)\theta$ for $\alpha \ge 1$.

## Inverse Gamma(α, θ)
$f(x) = \dfrac{\theta^\alpha e^{-\theta/x}}{\Gamma(\alpha)\,x^{\alpha+1}}$
$E[X] = \dfrac{\theta}{\alpha-1}, \qquad E[X^2] = \dfrac{\theta^2}{(\alpha-1)(\alpha-2)}, \qquad E[X^k] = \dfrac{\theta^k}{(\alpha-1)\cdots(\alpha-k)}$

## Inverse Exponential(θ)
$f(x) = \dfrac{\theta e^{-\theta/x}}{x^2}, \qquad F(x) = e^{-\theta/x}$; Mode $= \theta/2$; moments $E[X^k]$ exist only for $k < 1$.

## Weibull(θ, τ)
$F(x) = 1 - e^{-(x/\theta)^\tau}, \qquad E[X^k] = \theta^k\,\Gamma\!\left(1 + \dfrac{k}{\tau}\right)$
$\operatorname{VaR}_p(X) = \theta\left[-\ln(1-p)\right]^{1/\tau}$

## Lognormal(μ, σ)
$Y \sim \text{Normal}(\mu, \sigma^2),\; X = e^Y \;\Rightarrow\; X \sim \text{Lognormal}(\mu, \sigma)$
$F(x) = \Phi\!\left(\dfrac{\ln x - \mu}{\sigma}\right), \qquad E[X^k] = e^{k\mu + k^2\sigma^2/2}$
$E[X \wedge d] = e^{\mu + \sigma^2/2}\,\Phi\!\left(\dfrac{\ln d - \mu - \sigma^2}{\sigma}\right) + d\left[1 - \Phi\!\left(\dfrac{\ln d - \mu}{\sigma}\right)\right]$

## Single-Parameter Pareto(α, θ)
$f(x) = \dfrac{\alpha\theta^\alpha}{x^{\alpha+1}},\; x > \theta$
$E[X] = \dfrac{\alpha\theta}{\alpha-1}, \qquad E[X^2] = \dfrac{\alpha\theta^2}{\alpha-2}, \qquad \operatorname{Var}[X] = \dfrac{\alpha\theta^2}{(\alpha-1)^2(\alpha-2)}$
$E[X \wedge d] = \dfrac{\alpha\theta}{\alpha-1} - \dfrac{\theta^\alpha}{(\alpha-1)d^{\alpha-1}} \quad (d \ge \theta)$

## Loglogistic(γ, θ)
$F(x) = \dfrac{(x/\theta)^\gamma}{1 + (x/\theta)^\gamma}, \qquad \operatorname{VaR}_p(X) = \theta\left(\dfrac{p}{1-p}\right)^{1/\gamma}$

## Burr(α, γ, θ)
$F(x) = 1 - \left[\dfrac{1}{1 + (x/\theta)^\gamma}\right]^\alpha, \qquad \operatorname{VaR}_p(X) = \theta\left[(1-p)^{-1/\alpha} - 1\right]^{1/\gamma}$

## Uniform(a, b)
$f(x) = \dfrac{1}{b-a}, \qquad E[X] = \dfrac{a+b}{2}, \qquad E[X^2] = \dfrac{b^3 - a^3}{3(b-a)} = \dfrac{a^2 + ab + b^2}{3}, \qquad \operatorname{Var}[X] = \dfrac{(b-a)^2}{12}$

## Beta(a, b, θ)
$X/\theta \sim \text{Beta}(a, b)$ with density $\dfrac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\,u^{a-1}(1-u)^{b-1}$
$E[X] = \dfrac{\theta a}{a+b}, \qquad \operatorname{Var}[X] = \dfrac{\theta^2 ab}{(a+b)^2(a+b+1)}$

## Normal(μ, σ²)
$f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\,e^{-(x-\mu)^2/2\sigma^2}$
$aX + b \sim N(a\mu + b,\; a^2\sigma^2)$

## Poisson(λ)
$p_k = \dfrac{e^{-\lambda}\lambda^k}{k!}, \qquad E[N] = \operatorname{Var}[N] = \lambda, \qquad P(z) = e^{\lambda(z-1)}$
$(a, b, 0)$: $a = 0$, $b = \lambda$. A gamma mixture of Poissons is negative binomial.

## Binomial(m, q)
$p_k = \binom{m}{k}q^k(1-q)^{m-k}, \qquad E[N] = mq, \qquad \operatorname{Var}[N] = mq(1-q), \qquad P(z) = [1 + q(z-1)]^m$
$(a, b, 0)$: $a = -\dfrac{q}{1-q}$, $b = \dfrac{(m+1)q}{1-q}$.

## Negative Binomial(r, β)
$p_0 = (1+\beta)^{-r}, \qquad p_k = \binom{r+k-1}{k}\left(\dfrac{1}{1+\beta}\right)^r\left(\dfrac{\beta}{1+\beta}\right)^k$
$E[N] = r\beta, \qquad \operatorname{Var}[N] = r\beta(1+\beta), \qquad P(z) = [1 - \beta(z-1)]^{-r}$
$(a, b, 0)$: $a = \dfrac{\beta}{1+\beta}$, $b = \dfrac{(r-1)\beta}{1+\beta}$.

## Geometric(β)
Negative binomial with $r = 1$: $p_k = \dfrac{\beta^k}{(1+\beta)^{k+1}}, \qquad E[N] = \beta, \qquad \operatorname{Var}[N] = \beta(1+\beta)$

## Zero-Truncated & Zero-Modified Distributions
$p_k^T = \dfrac{p_k}{1 - p_0}, \qquad p_k^M = (1 - p_0^M)\,p_k^T, \qquad E[N^M] = \dfrac{1 - p_0^M}{1 - p_0}\,E[N]$
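The $(a, b, 0)$ recursion that underlies the frequency distributions above can be verified directly. A sketch using the Poisson ($a = 0$, $b = \lambda$) as the check:

```python
# (a, b, 0) recursion: p_k = p_{k-1} * (a + b/k), compared against the exact
# Poisson pmf with a = 0, b = lambda.
from math import exp, factorial

def ab0_pmf(a, b, p0, kmax):
    p = [p0]
    for k in range(1, kmax + 1):
        p.append(p[-1] * (a + b / k))
    return p

lam = 3.0
p = ab0_pmf(0.0, lam, exp(-lam), 6)
exact = [exp(-lam) * lam**k / factorial(k) for k in range(7)]
print(p[3], exact[3])
```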