-
Challenges to territorial ratemaking
- 1. Location tends to be heavily correlated with other rating variables
- E.g., high value homes often located together
- Makes traditional univariate analysis very susceptible to distortions
- 2. Often analyze territory as collection of small units
- Data in each individual territory is sparse
-
Territorial ratemaking generally involves two phases
- 1. Establishing territorial boundaries
- 2. Determining rate relativities for the territories
-
Describe how to determine the geographic unit
- Should be relatively homogeneous w.r.t. geographic differences
- Typical units are:
- *Postal codes
- A: Most readily available
- D: Change over time
- *Census blocks
- A: Static over time
- D: Must map insurance pols to them
- *Counties
- A: Static, readily available
- D: B/c large, usually have very heterogeneous risks
- Estimate the geographic risk associated with the selected geographic unit
- Key to accurately estimating geographic risk is isolating the geographic signal in the data, i.e., separating geographic elements (e.g., weather indices) from non-geographic elements (e.g., AOI)
-
Two major issues calculating the Geographic Estimator with univariate techniques
- 1. Sparse data results in volatile experience
- 2. Location tends to be highly correlated with other non-geographic factors
-
What technique is better than a univariate one for estimating the geographic estimator?
Multivariate approach with non-geographic & geographic variables
- Geo-demographic (pop density)
- Geo-physical (avg rainfall)
- Third party data
- Can isolate signal from noise better
-
Define spatial smoothing & name the two spatial smoothing techniques used to improve the estimate of a unit by using info from nearby units
Spatial smoothing improves the estimate for a unit by using information from nearby units
- 1. Distance-based approach
- Gives weight to nearby geographic units based on distance from the primary unit
- Easy to understand & implement (see the weighting sketch after this list)
- Makes no distinction between a rural & an urban mile, and ignores natural/artificial boundaries
Assumption tends to be most appropriate for weather-related perils
- 2. Adjacency-based
- Weight given to rings of adjacent units
- Immediately adjacent units get most weight
Tends to be most appropriate for perils driven by socio-demographic characteristics (e.g., theft)
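A minimal sketch of the distance-based weighting idea; the Gaussian kernel, bandwidth, and unit coordinates are all illustrative assumptions, not a prescribed method:

```python
import numpy as np

def distance_smoothed_estimate(coords, raw_estimates, bandwidth=10.0):
    """Blend each unit's raw estimator with nearby units' estimators,
    weighting by distance. The kernel and bandwidth are assumptions."""
    coords = np.asarray(coords, dtype=float)      # (n, 2) unit centroids
    raw = np.asarray(raw_estimates, dtype=float)  # (n,) raw geographic estimators
    # Pairwise distances between unit centroids
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # Gaussian kernel: nearby units get most weight; weight decays with distance
    w = np.exp(-(dist / bandwidth) ** 2)
    # Smoothed estimate = weighted average of all units' raw estimates
    return (w @ raw) / w.sum(axis=1)

# Toy example: the middle unit is pulled toward its close neighbor
coords = [(0.0, 0.0), (5.0, 0.0), (50.0, 0.0)]
print(distance_smoothed_estimate(coords, [1.20, 0.80, 1.50]))
```

An adjacency-based version would replace the distance kernel with weights assigned by ring: immediately adjacent units get the most weight, then units adjacent to those, and so on.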
-
Basic clustering routines
- *Quantile methods
- Use equal number of observations
- *Similarity methods
- Based on how close estimators are
- Note - do not naturally produce contiguous groupings
- Need to add contiguity constraint if that is desired
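A minimal sketch of the quantile method, assuming hypothetical smoothed estimators per unit; note the resulting groups are based on estimator values only and are not necessarily contiguous:

```python
import pandas as pd

# Hypothetical smoothed geographic estimators by unit (illustrative values)
est = pd.Series({"unit_a": 0.82, "unit_b": 0.95, "unit_c": 1.01,
                 "unit_d": 1.10, "unit_e": 1.35, "unit_f": 1.60})

# Quantile method: equal number of units per territory group
groups = pd.qcut(est, q=3, labels=["low", "mid", "high"])
print(groups)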
-
Importance of ILFs growing for several reasons
- 1. Personal wealth continues to grow
- 2. Economic Inflation drives up costs
- 3. Social inflation
-
What are the two types of policy limits offered?
- 1. Single limit:
- Total amount insurer will pay for a single claim
- 2. Compound limit:
- Applies two or more limits to the covered loss
- e.g. Personal Auto: split BI limit refers to per claimant & per accident
-
What are the assumptions made when calculating ILFs?
- 1. All UW expenses are variable
- 2. Variable expense and profit don't vary by limit
- 3. Usually that frequency and severity are independent
- 4. Frequency is same regardless of limit chosen
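Under assumptions 3 and 4, the ILF reduces to a ratio of limited average severities, ILF(H) = LAS(H) / LAS(B). A minimal sketch with made-up ground-up losses:

```python
def limited_average_severity(losses, limit):
    """Average of ground-up losses censored at `limit`."""
    capped = [min(x, limit) for x in losses]
    return sum(capped) / len(capped)

# Hypothetical ground-up losses; basic limit 100k, increased limit 250k
losses = [20_000, 45_000, 90_000, 150_000, 400_000]
las_basic = limited_average_severity(losses, 100_000)
las_increased = limited_average_severity(losses, 250_000)
print(f"ILF(250k) = {las_increased / las_basic:.3f}")   # ~1.563
```

Because frequency is assumed identical across limits, it cancels out of the ratio, leaving only the severity comparison.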
-
Why would you choose to vary the profit provision by limit?
- 1. Experience in higher limits can be volatile
- 2. Losses are less frequent but very severe
- 3. Greater variability adds uncertainty so more risky and challenging to price
- 4. May alter profit provision to reflect higher cost of capital needed to support additional risk
-
Give an example of why frequency may vary by limit chosen
E.g., Personal Auto: a person who chooses a high limit tends to have lower frequency
May be due to the fact that an individual choosing a higher limit may be more risk averse
-
Additional Considerations when performing ILF Ratemaking
- 1. Historical losses should be adjusted for expected trend
- *Assume a constant positive % trend in total losses
- Basic limit trend <= total limits trend <= increased limit trend (see the numeric illustration after this list)
- 2. Depending on age of data, claims may not be settled
- *Ideally all claims should be developed to ultimate
- 3. Losses may be censored from below if policy has a deductible
- *Can add back deductible
- *May not be possible to know how many claims were completely eliminated due to deductible
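A small numeric illustration of the leverage effect from point 1; the 10% trend, limit, and losses are all made up:

```python
trend = 1.10                                   # assumed +10% severity trend
basic_limit = 100_000
losses = [50_000, 80_000, 120_000, 300_000]    # made-up ground-up losses

# Basic limit layer: losses capped at the basic limit before and after trend
basic_before = sum(min(x, basic_limit) for x in losses)
basic_after = sum(min(x * trend, basic_limit) for x in losses)

# Excess layer: portion of each loss above the basic limit
excess_before = sum(max(x - basic_limit, 0) for x in losses)
excess_after = sum(max(x * trend - basic_limit, 0) for x in losses)

print(f"basic limit trend:  {basic_after / basic_before - 1:.1%}")    # ~3.9%
print(f"total limits trend: {trend - 1:.1%}")                         # 10.0%
print(f"excess layer trend: {excess_after / excess_before - 1:.1%}")  # ~19.1%
```

Capped losses absorb less of the trend, so the basic limit layer trends slower than total limits, while the excess layer trends faster.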
-
Fitted Data Approach
- Fit curves to empirical data
- Smooth out random fluctuations
- Common distributions include lognormal, Pareto, and the truncated Pareto
- LAS(H) = E[min(X, H)] = (integral from 0 to H of x*f(x) dx) + H*[1 - F(H)]
- First term: loss amount for all claims below the limit, weighted by the probability of occurring
- Second term: limit * probability of the loss exceeding the limit
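A sketch of the two terms under an assumed lognormal fit (parameters are illustrative, not from real data); scipy evaluates E[min(X, H)] numerically:

```python
from scipy import stats
from scipy.integrate import quad

# Assumed lognormal severity fit (s = sigma, scale = exp(mu))
dist = stats.lognorm(s=1.5, scale=30_000)

def limited_expected_value(limit):
    # First term: expected loss amount for claims below the limit
    first, _ = quad(lambda x: x * dist.pdf(x), 0, limit)
    # Second term: limit times the probability the loss exceeds the limit
    second = limit * dist.sf(limit)
    return first + second

# Fitted ILF as the ratio of limited expected values
print(limited_expected_value(250_000) / limited_expected_value(100_000))
```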
-
Multivariate Approach to ILFs
- 1. GLMs can deal more effectively with sparse data
- 2. Major difference between GLM and univariate approaches using LASs
- *GLM does not assume frequency is same for all risks
-
Two basic types of deductibles
1. Flat dollar deductible specifies a dollar amount below which losses are not covered by the policy
2. Percentage deductibles are stated as a percentage of coverage amount
-
Some reasons deductibles are used
- 1. Premium reduction
- 2. Eliminates small nuisance claims
- 3. Provides incentive for loss control
- 4. Controls catastrophic exposure
-
Other Considerations to deductible ratemaking
- 1. Censored Data
- *Ground-up losses may not be known due to fact that losses below deductible are often not reported
- *Cannot use data for policies with higher deductibles to price lower deductibles
- 2. Trend and development
- *Losses should be trended and developed to ultimate
-
Fitted Data Approach to deductible pricing
- LER can be calculated given a continuous distribution of losses: LER(D) = E[min(X, D)] / E[X] (sketched below)
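A minimal sketch, assuming an illustrative lognormal severity fit (the parameters and deductible are made up):

```python
from scipy import stats
from scipy.integrate import quad

dist = stats.lognorm(s=1.2, scale=10_000)   # assumed severity fit

def limited_expected_value(limit):
    first, _ = quad(lambda x: x * dist.pdf(x), 0, limit)
    return first + limit * dist.sf(limit)    # E[min(X, limit)]

deductible = 500
ler = limited_expected_value(deductible) / dist.mean()   # loss elimination ratio
print(f"LER({deductible}) = {ler:.3f}, deductible relativity = {1 - ler:.3f}")
```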

-
Practical Considerations to Deductible Pricing
- 1. Claim behavior
- *Method assumes insureds at different deductible levels have same claiming behavior
- 2. Choice of Deductible
- *Lower-risk insureds tend to choose higher deductibles
- 3. Using univariate and multivariate classification techniques will reflect behavioral differences in data
- *Actuary may review these before selecting relativities
- 4. Size of premium credit
- *Depending on the size of the policy, the premium credit for selecting a deductible may exceed the deductible itself
-
Options to account for different expected expense and loss levels for larger insureds
- 1. Vary the expense component
- 2. Incorporate premium discounts
- 3. Include loss constants
-
Three ways to adjust for large policies being overcharged for expenses
1. Calculate variable expense provision that only applies to first $5,000 of standard premium
2. Expense Constant
3. Premium Discount
-
Explain why loss experience is generally better on large risks than on small risks
1. Small companies have less sophisticated safety programs due to the cost to implement and maintain them
2. Small companies often lack programs to help injured workers return to work
- 3. Experience rating plan provides an incentive for safety procedures
- *Small insureds either unaffected or only slightly impacted by ERP, thus less incentive to control losses
-
Name two problems when some homes are not insured to value
1. Insured will not be fully covered X% of the time
2. If the insurer assumes all homes are insured to full value, it will charge a rate that is insufficient to cover expected payments on the home => the rate is therefore not equitable
-
Coinsurance apportionment ratio
- Ratio of insured amount to either
- (i) a stated sum, or
- (ii) a specified percentage of the value of the insured property
Max ratio is 1
-
Coinsurance requirement
Least amount of insurance for which apportionment ratio = 1
-
Coinsurance deficiency
Amount by which the coinsurance requirement exceeds the amount of insurance carried
-
Coinsurance penalty
Amount (greater than zero) by which an indemnity payment for a loss is reduced by operation of a coinsurance clause
* Note: no coinsurance penalty exists if the full policy face amount is paid, whether or not a coinsurance clause applies
- With L = loss, I = indemnity payment, F = face amount, c = coinsurance requirement %, V = property value:
- e = L - I if L <= F
- e = F - I if F < L < cV
- e = 0 if L >= cV
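A sketch of these three cases, assuming indemnity I = min(aL, F) with apportionment ratio a = F/(cV) capped at 1:

```python
def coinsurance_penalty(L, F, V, c):
    """Penalty e = amount the payment is reduced by the coinsurance clause.
    L = loss, F = face, V = property value, c = coinsurance requirement %."""
    a = min(F / (c * V), 1.0)        # apportionment ratio, capped at 1
    indemnity = min(a * L, F, L)     # I = min(aL, F), never more than the loss
    if L <= F:
        return L - indemnity         # e = L - I
    elif L < c * V:
        return F - indemnity         # e = F - I
    return 0.0                       # L >= cV: full face paid, no penalty

# Home worth 500k, insured for 300k with an 80% coinsurance requirement:
# a = 0.75, so a 200k loss pays only 150k -> penalty of 50k
print(coinsurance_penalty(L=200_000, F=300_000, V=500_000, c=0.80))
```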
-
Insurance to value
Property is insured to the exact extent assumed in the premium rate calculation
-
When will the maximum coinsurance penalty occur?
When L = F: for L <= F the penalty is L - I = (1 - a)L, which grows with the loss, while for F < L < cV it is F - aL, which shrinks as the loss grows, so the maximum occurs at L = F
-
Formula to determine rate for a selected ITV
- Pure premium rate = f * [(integral from 0 to F of L*s(L) dL) + F * (integral from F to V of s(L) dL)] / F, where
- f = frequency of loss
- s(L) = prob of loss of a given size
- V = max possible loss
- F = Face Value of policy
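A numeric sketch of this rate, assuming a made-up frequency and a uniform severity distribution on (0, V]:

```python
from scipy.integrate import quad

f = 0.05             # assumed loss frequency
V = 500_000.0        # maximum possible loss
s = lambda L: 1 / V  # assumed pdf of loss size given a loss (uniform on (0, V])

def pure_premium_rate(F):
    """Rate per $1 of face: f * [expected loss capped at F] / F."""
    below, _ = quad(lambda L: L * s(L), 0, F)   # losses below the face
    above, _ = quad(s, F, V)                    # prob. loss exceeds the face
    return f * (below + F * above) / F

for face in (100_000, 250_000, 500_000):
    print(face, round(pure_premium_rate(face), 5))
```

Under the uniform assumption the rate works out to f*(1 - F/(2V)), i.e., it declines at a constant rate as F grows, matching case 2 in the next card.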
-
The pure premium rate decreases as the policy face increases; describe the rate of premium rate change
1. If small losses outnumber large ones, rates decrease at a decreasing rate
2. If losses of all sizes are equally numerous, rates decrease at a constant rate
3. If large losses outnumber small ones, rates decrease at an increasing rate
-
Insurance to Value Initiatives
1. Guaranteed Replacement Cost (GRC) encourages insurance to full value
2. Sophisticated property estimation tools
3. Companies can generate additional premium without increasing rates by increasing AOI on underinsured homes
- Industry has also made better use of
- *Property inspections
- *Indexation clauses
- *Education of insureds