- TIA EXAM 5 - WERNER CH 11

  1. Challenges to territorial ratemaking
    • 1. Tends to be heavily correlated with other rating variables
    • E.g., high-value homes are often located together
    • Makes traditional univariate analysis very susceptible to distortions

    • 2. Often analyze territory as collection of small units
    • Data in each individual territory is sparse
  2. Territorial ratemaking generally involves two phases
    • 1. Establishing territorial boundaries
    • 2. Determining rate relativities for the territories
  3. Describe how to determine Geographic Unit
    • Should be relatively homogeneous wrt geographic differences
    • Typical units are:

    • *Postal codes
    • A: Most readily available
    • D: Change over time

    • *Census blocks
    • A: Static over time
    • D: Must map insurance policies to them

    • *Counties
    • A: Static, readily available
    • D: B/c large, usually have very heterogeneous risks

    • Estimate geographic risk associated with selected geographic unit
    • Key to accurately estimating geographic risk is isolating the geographic signal in the data, i.e., separating geographic elements (e.g., weather indices) from non-geographic elements (e.g., AOI)
  4. Two major issues calculating the Geographic Estimator with univariate techniques
    • 1. Sparse data results in volatile experience
    • 2. Location tends to be highly correlated with other non-geographic factors
  5. What technique is better than a univariate one for estimating the geographic estimator?
    A multivariate approach with both non-geographic & geographic variables

    • Geo-demographic (pop density)
    • Geo-physical (avg rainfall)
    • Third party data
    • Can isolate signal from noise better
  6. Define spatial smoothing & name the two spatial smoothing techniques used to improve the estimate of a unit
    Spatial smoothing improves the estimate of a geographic unit by using information from nearby units

    • 1. Distance-based approach
    • Give weight to nearby geographic units based on distance from primary unit

    • Easy to understand & implement
    • Treats a rural mile the same as an urban mile & ignores natural/artificial boundaries

    Assumption tends to be most appropriate for weather-related perils (see the weighting sketch after this item)

    • 2. Adjacency-based
    • Weight given to rings of adjacent units
    • Immediately adjacent units get most weight

    Tends to be most appropriate for perils driven by socio-demographic characteristics (e.g., theft)
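    Below is a minimal sketch of the distance-based idea, assuming hypothetical unit centroids and raw loss-cost estimates; the exponential-decay weight and the bandwidth are illustrative choices, not ones prescribed by the source text.

```python
import numpy as np

def distance_smooth(coords, raw_estimates, bandwidth=10.0):
    """Smooth each unit's raw estimate using distance-weighted neighbors.

    coords        : (n, 2) array of unit centroids (hypothetical x/y in miles)
    raw_estimates : (n,) array of raw geographic estimators (e.g., loss costs)
    bandwidth     : distance scale; nearer units get exponentially more weight
    """
    coords = np.asarray(coords, dtype=float)
    raw = np.asarray(raw_estimates, dtype=float)
    # Pairwise straight-line distances between unit centroids
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    # Exponential-decay weights: weight falls off with distance from the primary unit
    w = np.exp(-d / bandwidth)
    # Smoothed estimate = weighted average of all units' raw estimates
    return (w @ raw) / w.sum(axis=1)

# Toy data: the middle unit is pulled toward its close neighbor, barely toward the far one
print(distance_smooth([(0, 0), (5, 0), (50, 0)], [100.0, 300.0, 120.0]))
```

    Note the kernel treats every mile identically, which is exactly the rural/urban and boundary caveat listed above.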
  7. Basic clustering routines
    • *Quantile methods
    • Create groups containing equal numbers of observations (sketch after this item)

    • *Similarity methods
    • Based on how close estimators are

    • Note - do not naturally produce contiguous groupings
    • Need to add contiguity constraint if that is desired
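    A minimal sketch of the quantile method, assuming smoothed estimator values for ten hypothetical units; note the resulting groups need not be geographically contiguous.

```python
import numpy as np

def quantile_clusters(estimates, n_groups):
    """Quantile method: sort units by their (smoothed) estimator and split
    them into groups containing roughly equal numbers of observations."""
    estimates = np.asarray(estimates, dtype=float)
    order = np.argsort(estimates)              # unit indices, low to high
    labels = np.empty(len(estimates), dtype=int)
    for g, idx in enumerate(np.array_split(order, n_groups)):
        labels[idx] = g                        # assign group number
    return labels

# Ten hypothetical unit estimates -> five territory groups of two units each
print(quantile_clusters([1.2, 0.8, 3.1, 2.2, 0.9, 1.8, 2.9, 1.1, 2.5, 0.7], 5))
```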
  8. Importance of ILFs is growing for several reasons
    • 1. Personal wealth continues to grow
    • 2. Economic Inflation drives up costs
    • 3. Social inflation
  9. What are the two types of policy limits offered?
    • 1. Single limit:
    • Total amount insurer will pay for a single claim

    • 2. Compound limit:
    • Applies two or more limits to the covered loss
    • e.g., Personal Auto: split BI limits apply per claimant & per accident
  10. What are the assumptions made when calculating ILFs?
    • 1. All UW expenses are variable
    • 2. Variable expense and profit don't vary by limit
    • 3. Frequency and severity are usually assumed independent
    • 4. Frequency is the same regardless of limit chosen
    • Under these assumptions, ILF(H) = LAS(H) / LAS(B) (sketch after this list)
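    A minimal sketch of the resulting empirical calculation, using hypothetical ground-up claims: cap each claim at the limit, average, and take the ratio of limited average severities.

```python
import numpy as np

def limited_avg_severity(losses, limit):
    """Empirical LAS: average of ground-up losses censored at the limit."""
    return np.minimum(np.asarray(losses, dtype=float), limit).mean()

# Hypothetical ground-up claims; basic limit 100K, increased limit 250K
claims = [15_000, 40_000, 90_000, 180_000, 420_000]
ilf = limited_avg_severity(claims, 250_000) / limited_avg_severity(claims, 100_000)
print(f"ILF(250K) = {ilf:.3f}")   # 115,000 / 69,000 = 1.667
```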
  11. Why would you choose to vary the profit provision by limit?
    • 1. Experience in higher limits can be volatile
    • 2. Losses are less frequent but very severe
    • 3. Greater variability adds uncertainty, making higher limits riskier and more challenging to price
    • 4. May alter profit provision to reflect higher cost of capital needed to support additional risk
  12. Give an example of why frequency may vary by limit chosen
    E.g., Personal Auto - a person who chooses a high limit tends to have lower frequency

    This may be because an individual choosing a higher limit may be more risk-averse
  13. Additional Considerations when performing ILF Ratemaking
    • 1. Historical losses should be adjusted for expected trend
    • *Assume a constant positive % trend in total losses
    • Then basic limit trend <= total limits trend <= increased limits trend (numeric illustration after this item)

    • 2. Depending on age of data, claims may not be settled
    • *Ideally all claims should be developed to ultimate

    • 3. Losses may be censored from below if policy has a deductible
    • *Can add back deductible
    • *May not be possible to know how many claims were completely eliminated due to deductible
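    A numeric illustration of the trend ordering above, assuming hypothetical losses, a $100,000 basic limit, and a constant +10% ground-up trend: capped losses absorb less of the trend, excess losses absorb more.

```python
import numpy as np

losses = np.array([20_000, 80_000, 150_000, 600_000], dtype=float)
B = 100_000                     # basic limit
trended = losses * 1.10         # constant +10% trend in total (ground-up) loss

def trend_rate(new, old):
    return new.sum() / old.sum() - 1

basic  = trend_rate(np.minimum(trended, B), np.minimum(losses, B))
total  = trend_rate(trended, losses)
excess = trend_rate(np.maximum(trended - B, 0), np.maximum(losses - B, 0))
print(f"basic {basic:.1%} <= total {total:.1%} <= excess {excess:.1%}")
# basic 3.3% <= total 10.0% <= excess 13.6%
```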
  14. Fitted Data Approach
    • Fit curves to empirical data
    • Smooth out random fluctuations
    • Common distributions include lognormal, Pareto, and the truncated Pareto
    • LAS(H) = ∫₀^H x·f(x) dx + H × [1 − F(H)]  (lognormal sketch after this item)
    • First term: loss amount for all claims below the limit × probability of occurring
    • Second term: limit × probability of a loss exceeding the limit
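    A minimal sketch of the fitted approach, assuming a lognormal severity with hypothetical parameters; the closed-form limited expected value has exactly the two terms above.

```python
from math import exp, log
from scipy.stats import norm

def lognormal_las(limit, mu, sigma):
    """Closed-form LAS, E[X ^ limit], for a lognormal severity.

    first  = expected loss amount from claims below the limit
    second = limit * probability a loss exceeds the limit
    """
    z = (log(limit) - mu) / sigma
    first = exp(mu + sigma ** 2 / 2) * norm.cdf(z - sigma)
    second = limit * (1 - norm.cdf(z))
    return first + second

mu, sigma = 9.0, 1.5                       # hypothetical fitted parameters
ilf = lognormal_las(500_000, mu, sigma) / lognormal_las(100_000, mu, sigma)
print(f"ILF(500K) = {ilf:.3f}")
```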
  15. Multivariate Approach to ILFs
    1. GLMs can deal more effectively with sparse data

    • 2. Major difference between GLM and univariate approaches using LASs
    • *GLM does not assume frequency is the same for all risks
  16. Two basic types of deductibles
    1. Flat dollar deductible specifies a dollar amount below which losses are not covered by the policy

    2. Percentage deductibles are stated as a percentage of coverage amount
  17. Some reasons deductibles are used
    • 1. Premium reduction
    • 2. Eliminates small nuisance claims
    • 3. Provides incentive for loss control
    • 4. Controls catastrophic exposure
  18. Other Considerations to deductible ratemaking
    • 1. Censored Data
    • *Ground-up losses may not be known because losses below the deductible are often not reported
    • *Cannot use data for policies with higher deductibles to price lower deductibles

    • 2. Trend and development
    • *Losses should be trended and developed
  19. Fitted Data Approach to deductible pricing
    • LER can be calculated given a continuous distribution of losses
    • LER(D) = [∫₀^D x·f(x) dx + D × (1 − F(D))] / ∫₀^∞ x·f(x) dx  (sketch after this item)
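    A minimal sketch evaluating the LER formula by numerical integration, assuming a hypothetical lognormal severity; any continuous severity distribution fitted to the data could be substituted.

```python
from scipy import stats
from scipy.integrate import quad

def ler(deductible, dist):
    """LER(D) = (losses eliminated by the deductible) / (total expected loss)."""
    # Claims below D are eliminated in full...
    below, _ = quad(lambda x: x * dist.pdf(x), 0, deductible)
    # ...and each claim above D is reduced by exactly D
    eliminated = below + deductible * dist.sf(deductible)
    return eliminated / dist.mean()

sev = stats.lognorm(s=1.5, scale=8_000)    # hypothetical severity: sigma=1.5, exp(mu)=8000
for d in (500, 1_000, 2_500):
    print(f"LER({d}) = {ler(d, sev):.3f}")
```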
  20. Practical Considerations to Deductible Pricing
    • 1. Claim behavior
    • *Method assumes insureds at different deductible levels have the same claiming behavior

    • 2. Choice of Deductible
    • *Lower-risk insureds tend to choose higher deductibles

    • 3. Univariate and multivariate classification techniques will reflect behavioral differences in the data
    • *Actuary may review these before selecting relativities

    • 4. Size of premium credit
    • *Depending on the size of the policy, the premium credit for selecting a deductible may exceed the deductible itself
  21. Options to account for different expected expense and loss levels for larger insureds
    • 1. Vary the expense component
    • 2. Incorporate premium discounts
    • 3. Include loss constants
  22. Three ways to adjust for large policies being overcharged for expenses
    1. Calculate a variable expense provision that only applies to the first $5,000 of standard premium

    2. Expense Constant

    3. Premium Discount
  23. Explain why loss experience is generally better on large risks than on small risks
    1. Small companies have less sophisticated safety programs due to the cost to implement and maintain them

    2. Small companies often lack programs to help injured workers return to work

    • 3. Experience rating plan provides an incentive for safety procedures
    • *Small insureds are either unaffected or only slightly impacted by the ERP, thus have less incentive to control losses
  24. Name two problems for when some homes are not insured to value
    1. The insured will not be fully covered for some losses (those exceeding the policy face)

    2. If the insurer assumes all homes are insured to full value, it will charge a rate that is insufficient to cover expected payments on the home => the rate is therefore not equitable
  25. Coinsurance apportionment ratio
    • Ratio of insured amount to either
    • (i) a stated sum, or
    • (ii) a specified percentage of the value of the insured property

    Max ratio is 1
  26. Coinsurance requirement
    Least amount of insurance for which apportionment ratio = 1
  27. Coinsurance deficiency
    Amount by which the coinsurance requirement exceeds the carried insurance
  28. Coinsurance penalty
    Amount (greater than zero) by which an indemnity payment for a loss is reduced by operation of a coinsurance clause

    * Note: no coinsurance penalty exists if the full policy face amount is paid, whether or not a coinsurance clause applies

    • e = L - I if L <= F
    • e = F - I if F < L < cV
    • e = 0 if L >= cV
    • where L = loss, F = policy face, c = coinsurance percentage, V = property value, I = indemnity payment (sketch after this item)
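    A minimal sketch of the penalty formulas, assuming the standard indemnity I = min(aL, F, L) with apportionment ratio a = F/(cV) capped at 1; the dollar figures are hypothetical.

```python
def coinsurance_penalty(L, F, c, V):
    """Coinsurance penalty e for loss L, face F, coinsurance % c, value V."""
    a = min(F / (c * V), 1.0)   # apportionment ratio, max 1
    I = min(a * L, F, L)        # indemnity payment
    if L <= F:
        return L - I            # e = L - I when the loss is within the face
    if L < c * V:
        return F - I            # e = F - I when F < L < cV
    return 0.0                  # no penalty once L >= cV (full face is paid)

# Home worth 500K with an 80% coinsurance requirement (cV = 400K), insured to 300K
for L in (100_000, 300_000, 350_000, 450_000):
    print(f"L = {L:>7,}: penalty = {coinsurance_penalty(L, 300_000, 0.80, 500_000):,.0f}")
```

    The maximum penalty lands at L = F (here 75,000), matching item 30.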
  29. Insurance to value
    Property is insured to the exact extent assumed in the premium rate calculation
  30. When will the maximum coinsurance penalty occur?
    When L=F
  31. Formula to determine rate for a selected ITV
    • Rate = f × [∫₀^F L·s(L) dL + F × ∫_F^V s(L) dL] / F  (sketch after this item)
    • f = frequency of loss
    • s(L) = prob of loss of a given size
    • V = max possible loss
    • F = Face Value of policy
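    A minimal sketch evaluating the rate formula by numerical integration, assuming a hypothetical density under which small losses outnumber large ones; consistent with item 32, the rate falls at a decreasing rate as F grows.

```python
from scipy.integrate import quad

V, f = 500_000.0, 0.05          # max possible loss and loss frequency (hypothetical)

def s(L):
    """Hypothetical severity density on [0, V]: small losses outnumber large."""
    return 2 * (V - L) / V ** 2

def pure_premium_rate(F):
    """Rate per unit of face: f * (expected losses capped at F) / F."""
    capped, _ = quad(lambda L: L * s(L), 0, F)   # losses paid in full below F
    exceed, _ = quad(s, F, V)                    # probability the loss exceeds F
    return f * (capped + F * exceed) / F

for F in (100_000, 250_000, 500_000):
    print(f"F = {F:>7,}: rate = {pure_premium_rate(F):.4f}")
# Rate per unit of coverage declines as the face increases: 0.0407, 0.0292, 0.0167
```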
  32. The pure premium rate decreases as the policy face increases; describe the rate of change
    1. If small losses outnumber large ones, rates decrease at a decreasing rate

    2. If losses of all sizes are equally numerous, rates decrease at a constant rate

    3. If large losses outnumber small ones, rates decrease at an increasing rate
  33. Insurance to Value Initiatives
    1. Guaranteed Replacement Cost (GRC) encourages insurance to full value

    2. Sophisticated property estimation tools

    3. Companies can generate additional premium without increasing rates by increasing AOI on underinsured homes

    • Industry has also made better use of
    • *Property inspections
    • *Indexation clauses
    • *Education of insureds