Insurers have evolved from being large holders of primary data about customers to being large buyers of secondary data about consumers. ‘The more data the better’ is a view that permeates the sector. Yet in evolving this way, insurers lay themselves open to the risk of unacceptable practices carried out by their suppliers, in this case data brokers.
What this file release by Wolfie Christl does is give a measure of just how large that exposure could be. With 650,000 data segments up for sale by data brokers, characteristics of people relating to health, religion, political views, personal finance, employment, disability and ethnicity were being offered up to corporate buyers. Quite a few protected characteristics there.
Here are some examples of data segments in the file that Christl found on the website of Xandr, a Microsoft company:
- Religion > Jewish
- Health and Wellness > Lifestyle Indicators > Health Attitude: Apathetic
- Voter Profile > Lesbian / Gay / Bisexual / Transgender (LGBT) supporters
- Voter Profile > Pro-Choice supporters
And along with these segments, the data itself is sold: for example, the details of 2,523,034 ‘people who suffer from anxiety’.
Now some of you will argue that if someone suffers from anxiety, then that is simply how it is. Yet that mistakes inference for fact. Most of these segments are populated with data inferred from correlations between some personal attribute and location, shopping or online activity data. In other words, they are better than guesses, but often not by much.
Bigger Exposures
So why should insurers be bothered? After all, it’s not them doing this. Yet if they are buyers and users of large amounts of data from such brokers, that demarcation of ‘them, not us’ disappears. If they’re not carrying out proper due diligence against clear obligations around equalities and privacy, then their exposure is actually greater than that of the data broker. It’s greater because insurers are business-to-consumer firms, while data brokers are business-to-business firms. The insurer faces reputational and strategic exposure on top of the legal and financial exposures faced by data brokers.
What you have here is a powerful combination of two things. Firstly, you have a campaigner to whom policy makers and the media listen. And secondly, the points being made are backed up by data. This is evidence-based challenge, something to which insurers are finding themselves increasingly exposed (more here).
What insurers have to be good at, then, are the steps needed to stop challenges to data brokers from spilling over into the sector itself. One obvious big step is due diligence, both legal and ethical. It has to be ethical as well as legal because, while insurers may be permitted under equalities exemptions to use protected characteristics like sexual orientation or religion, I’m pretty sure they understand why, from the public’s point of view, it would not be a good idea to do so.
An Easy Test
There’s an easy test for this, and that is to look at the board members of a typical large UK insurer. There you will find outside interests covering mental health, women’s health, inclusivity and disability. Now frame a case to those board members for using data that is ‘little better than a guess’ about women’s reproductive health, disability and the like in underwriting, claims, marketing and counter-fraud decisions. In terms of career direction, the word ‘upwards’ does not come to mind.
One counter often used in relation to such concerns is anonymisation. It’s data but not personal data, goes the argument. This is at best a weak defence, as an MIT graduate student once showed. On hearing that her state’s health insurance records were going to be anonymised and released into the public domain, she spent $20 and a little time tracking down which of the records belonged to the state’s governor, and mailed a copy of them to him. He got the point and the law was changed.
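That re-identification was a simple linkage attack: joining the ‘anonymised’ records against a public register on the quasi-identifiers both datasets share, such as date of birth, sex and postcode. A minimal sketch below illustrates the idea; all names, fields and records in it are invented for illustration, not taken from any real dataset.

```python
# Hypothetical illustration of a linkage attack. Neither dataset is real.

# "Anonymised" health records: direct identifiers removed,
# but quasi-identifiers (dob, sex, zip) left in.
health_records = [
    {"dob": "1945-07-31", "sex": "M", "zip": "02138", "diagnosis": "hypertension"},
    {"dob": "1962-03-14", "sex": "F", "zip": "02139", "diagnosis": "asthma"},
]

# Public voter roll: names sit alongside the same quasi-identifiers.
voter_roll = [
    {"name": "W. Weld", "dob": "1945-07-31", "sex": "M", "zip": "02138"},
    {"name": "J. Doe", "dob": "1970-01-01", "sex": "F", "zip": "02139"},
]

QUASI_IDENTIFIERS = ("dob", "sex", "zip")


def link(health, voters):
    """Re-identify health records by joining on shared quasi-identifiers."""
    index = {tuple(v[k] for k in QUASI_IDENTIFIERS): v["name"] for v in voters}
    matches = []
    for record in health:
        key = tuple(record[k] for k in QUASI_IDENTIFIERS)
        if key in index:
            # A unique match on dob + sex + zip puts a name back on the record.
            matches.append({"name": index[key], **record})
    return matches


for match in link(health_records, voter_roll):
    print(match["name"], "->", match["diagnosis"])
```

The point of the sketch is how little is needed: no special access, no sophisticated statistics, just two datasets and a common key. The more quasi-identifiers a ‘de-identified’ dataset retains, the more often that key is unique to one person.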
To Sum Up
Secondary data comes with ethical risk. Due diligence is a key tool for mitigating that risk, but only so long as its scope is set correctly. The problem is that that scope is usually just legal. That’s like driving down the motorway with a windscreen a fraction of its normal size.
That wider due diligence should address the questionable nature of much of the data being collected and sold by data brokers. After all, these data segments exist because there’s a market interested in buying them. If your board members aren’t all comfortable using segments such as those mentioned above, make sure you’re not contributing to the market for them.