When you have deployed an underwriting engine, that is not the end of the journey. Quite the opposite: it is just the beginning.

You will have done extensive testing before launch to check that the rules are applying the same underwriting philosophy as your human underwriters, so you can rightly have a lot of faith in the original set-up. However, you can’t test absolutely everything – every possible case, every possible scenario – so there is a need to monitor what the engine is doing, and in particular how the rules are working. If something is not quite right, some adjustment needs to be made.

An obvious gauge of automated system performance is the straight-through processing (STP) rate, which is the proportion of cases on which a final underwriting decision is made without any human involvement. Is the STP rate what you planned for? If not, what can be done about it? Perhaps there are not enough rules to cover the range of disclosures, or perhaps the rules are not making enough final decisions, leading to too many referrals to traditional underwriting.
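As a rough illustration of the metric itself, the STP rate is a simple proportion over case outcomes. The sketch below is hypothetical – the record structure and field names are assumptions, not part of any particular engine:

```python
# Hypothetical sketch: computing the straight-through processing (STP) rate
# from a batch of case outcomes. The record shape is an assumption.
from dataclasses import dataclass

@dataclass
class CaseOutcome:
    case_id: str
    decided_automatically: bool  # True if no human underwriter was involved

def stp_rate(cases: list[CaseOutcome]) -> float:
    """Proportion of cases receiving a final decision with no human involvement."""
    if not cases:
        return 0.0
    auto = sum(1 for c in cases if c.decided_automatically)
    return auto / len(cases)

cases = [
    CaseOutcome("A1", True),
    CaseOutcome("A2", True),
    CaseOutcome("A3", False),  # referred to a human underwriter
    CaseOutcome("A4", True),
]
print(f"STP rate: {stp_rate(cases):.0%}")  # STP rate: 75%
```

Comparing this figure against the rate you planned for, month on month, is what flags that an adjustment may be needed.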

Sometimes the questions leading through the ‘decision tree’ are not phrased clearly enough, so applicants can only get so far before reaching what they think is a dead end; another case for human underwriting, then. Or, worse still, the customer and the adviser, if there is one, simply give up and find an alternative insurer.

A good automated system will have an analytics module, perhaps part of a ‘workbench’ or ‘master control panel’ that will give a detailed view of how the tool is performing, for example:

  • Which rules are being used, and how often, and which are not
  • The decisions each rule is producing
  • The time taken to progress through each rule when it is used – is it overly long?
  • Where in the decision tree any ‘drop-outs’ occur
  • Which disclosures don’t have a suitable rule and automatically go to manual underwriting.

The need for adjustments does not arise exclusively from rules performance analysis. Sometimes underwriting philosophy changes – maybe a risk factor needs a tougher stance, or can be handled more leniently. On occasion, new risk factors come along – COVID-19 is a good example.

It could also be that a rule is effective – that is, it makes the right decisions – but does it do so efficiently? Could decisions be reached sooner? Are all the questions actually necessary? Different people write rules in different ways and sometimes there is room for improvement. Think about the customer journey and what you can do to make it easy, quick and smooth.

So there is a need to review the performance of the engine and the intelligence within it. But not just once after implementation: it’s an ongoing process. And of course, part of that process involves data analysis. Every input and action of the engine is recorded digitally and these ‘digital diamonds’ are vital to the ongoing success, efficiency and profitability of the business. So regular data analysis, as we have discussed in a previous article, is important to enable a detailed understanding of the business flowing in and the characteristics of the growing portfolio. That analysis is a multi-disciplinary task, and some of the findings and conclusions need to be fed into the rules development programme.

In this ongoing process of rules improvement and development, a small ‘engine maintenance’ team is required to monitor what is going on, make adjustments and develop the rule-set in order to meet the changing needs of the business and the changing risk environment. We have already mentioned new risk factors but what about revisions to the schedule of routine evidence requirements or a new application form wording? The rules in the automated system need to be updated.

Being a member of the rules team is a specialist role, and an interesting one. Not all underwriters have the ability to ‘think rules’: to construct rules that are well worded, in language that makes sense to ordinary consumers, and that reach the right answers as quickly as possible. These team members also need a desire for constant improvement and perhaps a healthy discontent with the status quo. It is potentially a varied role too, with contact with the many stakeholders in the rules and their outcomes, from sales managers through to product and corporate actuaries. Heading the team is no small role within the underwriting department either, given that automated processing is or will be the dominant method of risk triage and pricing – there’s a lot riding on the engine’s success.

So, there is more to underwriting automation than getting an engine, building a rule-set and deploying it. Remember: ‘launch and leave’ just isn’t an option.