We know that price optimisation is an established feature of many markets – hotel beds and airline seats, for example. And it’s starting to be adopted in insurance markets, with some people saying that large chunks of the UK motor insurance market are now priced in that way. The question now becoming associated with price optimisation in insurance, however, is: is it fair?
Several states in the US have banned price optimisation in general insurance. And the National Association of Insurance Commissioners is currently consulting on whether to recommend that all US states ban it. Their basis for doing so is that price optimisation appears to be contrary to this near universal condition of US insurance law: “rates shall not be inadequate, excessive, or unfairly discriminatory.”
One factor behind this attention to price optimisation in the US general insurance market is a report issued in February this year by the White House: it concluded that price optimisation was fine in markets for things like hotel beds and airline seats, but not in sectors that use risk-based pricing, such as insurance.
Now, US insurers have been arguing that markets remain just as competitive, that price optimisation is not having the effect its critics say it would have, and that banning it is anti-competitive and anti-innovation.
I would not be at all surprised, though, if there is a widespread ban on price optimisation in at least the US personal lines market. The reason is that the insurance market there has not built a strong enough case that discriminatory pricing is a thing of the past. Doubts remain, and they are likely to remain for some time, largely because of questions being raised about the impact big data could have on how pricing decisions are made.
To illustrate this, consider a feature of big data that’s been behind some of those questions: correlation clustering. What correlation clustering does is focus on the relationships between objects, rather than on the representation of those objects themselves. When a strong set of relationships is found linking a person to other people or behaviours, a new piece of information is inferred and associated with that person’s identity. And that piece of what is called ‘manufactured information’ is then used in further rounds of decision making: about the products that person is offered, and about the price that person is asked to pay.
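As a rough illustration only – not drawn from any insurer’s system, and using made-up names and toy data – here is a minimal Python sketch of the kind of greedy ‘pivot’ heuristic often used for correlation clustering. Objects are grouped purely by whether they are judged related to one another, and each person then inherits a cluster label: the sort of derived attribute referred to above as ‘manufactured information’.

```python
import random

def correlation_cluster(people, related):
    """Toy 'pivot' correlation clustering.

    people  : list of identifiers (e.g. anonymised customer IDs)
    related : function(a, b) -> bool, True when the data about a and b
              is judged strongly correlated

    Returns a dict mapping each person to a cluster label -- the kind of
    derived attribute described above as 'manufactured information'.
    """
    remaining = list(people)
    random.shuffle(remaining)      # pivot order is arbitrary
    labels = {}
    cluster = 0
    while remaining:
        pivot = remaining.pop(0)
        # everyone judged related to the pivot joins its cluster
        members = [pivot] + [p for p in remaining if related(pivot, p)]
        for p in members:
            labels[p] = cluster
        remaining = [p for p in remaining if p not in members]
        cluster += 1
    return labels

# Hypothetical usage: group four customers by whether their quote histories correlate.
pairs = {("A", "B"), ("C", "D")}
print(correlation_cluster(["A", "B", "C", "D"],
                          lambda a, b: (a, b) in pairs or (b, a) in pairs))
```

The point is not the algorithm itself but what happens next: the cluster label, rather than anything the customer has actually disclosed, feeds into later decisions about products and prices.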
And there is a growing body of evidence that correlation clustering can lead to discriminatory outcomes for consumers – outcomes that have been influenced, for example, by a person’s race. This evidence has not so far come from the insurance market, but with the sector’s ever-growing adoption of big data, through tools such as price optimisation, it is, I would suggest, a risk that the sector needs to take seriously.
Now you may say, “we’re not like that; we would never go down that road”, and that’s great, but it’s not enough. The machine learning that is integral to big data means that you won’t know how that manufactured information is influencing your pricing decisions until the damage is done and the headlines are made.
So I wouldn’t be at all surprised if we see a debate in the UK about whether price optimisation in insurance should be controlled, perhaps even banned. And it’s a debate that could get quite heated, for the issues it raises relate to social justice, and social justice attracts a more intense political debate than, say, privacy.
And that’s just the technical side of it all. Think of it from the public’s perspective. If you explained the price optimisation of insurance to the people on the Clapham omnibus, then for them the unfairness of it really is a no-brainer. So if the FCA’s market study on big data in general insurance does have something to say about price optimisation, who in the market will be prepared to stand up and defend the practice on, say, BBC Radio 4’s Today programme? I can’t see any insurance chief executive doing so. And if that’s the case, then can insurers be sure that price optimisation is an ethical thing to do?