Privacy is an ethical issue that the UK insurance sector often thinks about, usually as something of a headache. And privacy is indeed a serious ethical issue for how insurance firms use data, but it is not the only one. Another that should be on their ‘ethical radar’ is often referred to as social sorting. And if privacy is a headache, then social sorting could be the stuff of nightmares. Here’s why.
We know insurers are very busy at the moment assembling bigger and bigger collections of data. And one of the main things they’re doing with it is sticking labels onto all that data; in other words, categorising it. In the past, this involved only a few labels: your age, where you lived, what type of car you drove and so on. You could see that this simple categorisation was obviously risk-related.
Lifestyle underwriting
As more data is collected and insurers move further towards lifestyle underwriting, things become more complicated. Datasets become bigger and more varied, which means that more and more categorisation is needed. This categorisation allows marketing people to slice and dice a whole warehouse of data for insight into who will buy what, and when; and allows underwriters to slice and dice for insight into propensity to pay and propensity to claim.
The insurance sector is awash with data brokers and software houses promising all sorts of ways to boost this insight. Some of the categorisations I’ve seen on offer seem far from neutral and objective. The biases in those categorisations may be merely annoying if they result in you receiving a lot of irrelevant marketing offers, but if an underwriter bases a large part of her risk assessment upon them, then various groups in society will find access to insurance becoming much more difficult.
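To make that concern concrete, here is a minimal, purely illustrative sketch in Python of how a broker-supplied lifestyle label might feed straight into a premium. The segment names, loadings and quote function are all hypothetical, not any insurer’s actual model; the point is simply that if the label itself is biased, the bias flows silently into the price.

```python
# Purely illustrative sketch - the segment labels, loadings and quote
# logic below are hypothetical, not any insurer's actual pricing model.

BASE_PREMIUM = 400.00  # assumed base annual premium in GBP

# Lifestyle segments as a data broker might supply them, each with a
# loading the underwriter applies. If the segmentation itself encodes
# bias, that bias gets priced in without anyone deciding to discriminate.
SEGMENT_LOADING = {
    "steady_professional": 0.90,
    "young_urban_renter":  1.35,
    "rural_retiree":       1.05,
}

def quote(postcode: str, broker_segment: str) -> float:
    """Return an annual premium for a hypothetical policy.

    The underwriter sees only the segment label, not how the broker
    derived it from the applicant's data.
    """
    loading = SEGMENT_LOADING.get(broker_segment, 1.20)  # unknown label = penalised
    return round(BASE_PREMIUM * loading, 2)

# Two applicants with identical claims histories can end up with very
# different prices purely because of the label attached to them.
print(quote("SW1A 1AA", "steady_professional"))  # 360.0
print(quote("SW1A 1AA", "young_urban_renter"))   # 540.0
```

The design point is the dictionary lookup: once a label arrives pre-packaged from a broker, the fairness question has already been settled upstream, out of the underwriter’s sight.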
The Mobile World
And this process of inclusion and exclusion will become much more subtle, as we begin to be categorised in a much more fluid and mobile world. We will be categorised not so much by what our IP address says about where we live, but much more by what the ‘internet of things’ says about everything we get up to in our daily lives.
So we’ll find these doors of inclusion and exclusion opening and closing at a much greater speed, and for a wider range of people, than in the past. This process, what has been called ‘social sorting’, will be experienced not just by a few groups of people, but increasingly by many of us.
And we won’t like some of the things it says about us: some of the products that become too expensive, some of the services that become difficult to access. The big danger is that this increasingly complex process of sorting risks, of differentiating between risks, could begin to look more like discriminating between risks. Insurance regulators in the United States refer to this as red-lining, a term born out of discriminatory mortgage underwriting, which turned into a complete nightmare for the US financial services sector. The issue becomes so highly charged because what you’re talking about is social justice, and that attracts a completely different order of political debate than privacy does.
Red-lining
Now, every insurance firm will be thinking that this is not a road it would ever go down – “we’re not like that”. And to a large extent, they’re right. But when I hear personal lines underwriters talk about no longer knowing how their end premiums are calculated, because the process for calculating them has become so complicated (all those different data sources and algorithms), and when I read about some of the accusations being levelled against data brokers serving the US financial services market, then, putting those two things together, I worry that some firms may already, unintentionally and unwittingly, be straying in that direction.
As an ethical risk, I think that ‘social sorting’ has the potential to be as big an issue for the public as payment protection insurance: perhaps not in financial terms, but certainly in reputational terms. If it were to happen, then regulation of insurance pricing would almost certainly be introduced in the UK and it could decimate the reputation of the UK insurance market for a lifetime.
So what do you do? I would suggest you start with the essentials:
- you learn the language of data, risk and ethics;
- you set some principles down and make sure everyone knows about them;
- you start helping people make decisions in line with those principles;
- you help people overcome any difficulties they encounter in doing all those things.