Academics launch new blueprint for building mortality models
A new blueprint for constructing mortality models has been developed by academics at the Pensions Institute at Cass Business School, part of City University London.
The blueprint could transform the way pension and annuity providers, as well as governments, forecast mortality rates.
Called the “general procedure”, the blueprint can be applied to any dataset to build a mortality model that fits all ages across the population.
In recent years, there has been an explosion in the number of new mortality models being developed. These models often involve ad hoc extensions to existing models that have questionable demographic significance. They also struggle to produce realistic forecasts of age-specific mortality rates.
“Rather than propose yet another new mortality model, we outline and implement a general procedure for building a mortality model from scratch,” says Professor David Blake (pictured), co-author of the research and director of the Pensions Institute.
“The general procedure is a way of constructing mortality models which are tailored to specific datasets. This means it is able to identify the idiosyncratic features of different populations, which conventional ‘off-the-peg’ models are unable to do.”
When tested on UK data, the general procedure comfortably outperformed simpler models such as the Lee-Carter model. It produced more parsimonious models than those generated using a mechanical algorithm, such as principal component analysis. It also gave well-specified cohort effects, which are essential for reliable forecasting of mortality rates.
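For context, the Lee-Carter baseline mentioned above models log mortality as log m(x,t) = a_x + b_x·k_t, with an age profile a_x, an age sensitivity b_x, and a period index k_t, conventionally estimated by a singular value decomposition. The sketch below is a minimal illustration of that standard fit on synthetic data invented purely for demonstration; it is not from the study and not the authors' code.

```python
import numpy as np

# Synthetic log death rates with a known Lee-Carter structure plus noise.
rng = np.random.default_rng(0)
n_ages, n_years = 40, 30
a_true = np.linspace(-6.0, -1.0, n_ages)        # log mortality rises with age
b_true = np.full(n_ages, 1.0 / n_ages)          # age sensitivity to the trend
k_true = np.linspace(5.0, -5.0, n_years)        # declining period index
log_m = (a_true[:, None] + np.outer(b_true, k_true)
         + rng.normal(scale=0.01, size=(n_ages, n_years)))

# Standard estimation: a_x = row means, then a rank-1 SVD of the centred matrix.
a_hat = log_m.mean(axis=1)
U, s, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat, k_hat = U[:, 0], s[0] * Vt[0]

# Identifiability constraints: sum(b_x) = 1 and sum(k_t) = 0.
scale = b_hat.sum()
b_hat, k_hat = b_hat / scale, k_hat * scale
a_hat = a_hat + b_hat * k_hat.mean()
k_hat = k_hat - k_hat.mean()

fitted = a_hat[:, None] + np.outer(b_hat, k_hat)
```

Because the model uses a single mechanically extracted age effect b_x, it is exactly the kind of structure the general procedure aims to improve on for populations whose age patterns the rank-1 form cannot capture.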
Existing mortality models cannot guarantee that cohort effects – effects that influence the mortality experience of a specific birth cohort – are well specified, because the age effects in these models are either set in advance without justification from the data, or are generated mechanically and so lack biological and demographic plausibility.
This causes problems when the model is used to project mortality rates into the future. The general procedure was developed to address this issue and produce well-specified mortality models.
“The procedure works by sequentially extending a simple mortality model – first adding freely varying age effects that take whatever shape best fits the data, then replacing these with simpler parametric age functions that do the same job. It uses a combination of expert judgement and a toolkit of functional forms,” says Andrew Hunt, another co-author of the study.
“This then achieves a good fit to the data with a relatively parsimonious model whose age effects can then be interpreted in light of the underlying socio-economic and demographic drivers of changing mortality rates.”
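The "extend, then simplify" step Hunt describes can be sketched schematically. The toy below is our own illustration of the idea, not the authors' implementation: it takes a freely varying age effect estimated from (synthetic) data and replaces it with the best candidate from a small toolkit of parametric forms, using BIC to trade fit against parsimony.

```python
import numpy as np

# A freely varying age effect, here synthetic: roughly linear in age plus noise.
ages = np.arange(20, 90)
rng = np.random.default_rng(1)
free_effect = 0.02 * (ages - 55) + rng.normal(scale=0.01, size=ages.size)

# A toy "toolkit of functional forms": design matrices for candidate shapes.
candidates = {
    "constant": np.ones((ages.size, 1)),
    "linear":   np.column_stack([np.ones(ages.size), ages.astype(float)]),
}

def bic(y, X):
    """BIC of a least-squares fit of y on the columns of X (Gaussian errors)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, k = y.size, X.shape[1]
    return n * np.log(rss / n) + k * np.log(n)

# Pick the most parsimonious form that still fits the estimated age effect.
best = min(candidates, key=lambda name: bic(free_effect, candidates[name]))
```

Here BIC correctly prefers the linear form, which reproduces the free effect with two parameters instead of seventy; in the actual procedure, expert judgement also enters the choice, so the selection is not purely mechanical.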