Why privacy solutions can come with ethical complications
Consider two threads. We know that how data is shared raises important privacy questions. And we know that insurers partner with third-party suppliers for a range of data and analytics needs. So what happens where those two threads cross? The answer is often presented in the form of powerful analytic tools that allow personal data to be shared with suppliers without compromising its privacy. Job done? Well, for privacy, largely yes, but for other ethical concerns, far from it. Insurers should take care to consider those wider issues on the ‘data ethics landscape’.
Encryption is often put forward as the solution to data sharing. You encrypt the data and send it to the supplier, who decrypts it, carries out the analysis, and sends it back to you re-encrypted. Nice, except that the privacy problem hasn’t gone away. Smaller, yes, but still present.
The decryption that the third-party supplier has to perform in order to work with that data defeats the very goal you set out to achieve by encrypting it in the first place. The data is still vulnerable while it is in the supplier’s hands. And while the Senior Managers and Certification Regime still holds the insurer accountable for what the supplier gets up to, it all feels too much like the proverbial stable door.
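To make that gap concrete, here is a minimal sketch of the conventional workflow in Python, using the widely used `cryptography` package. Everything here is illustrative: `run_analysis` is a hypothetical stand-in for whatever the supplier actually computes. Note the moment where the data sits in the clear.

```python
# A minimal sketch of the conventional encrypt-in-transit workflow.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

def run_analysis(plaintext: bytes) -> bytes:
    """Hypothetical placeholder for the supplier's analysis."""
    return plaintext.upper()

key = Fernet.generate_key()                # in practice, shared with the supplier
f = Fernet(key)

token = f.encrypt(b"policyholder record")  # protected while in transit

# --- at the supplier ---
plaintext = f.decrypt(token)               # the data is now in the clear:
result = run_analysis(plaintext)           # this is the exposure described above
token_back = f.encrypt(result)             # re-encrypted for the return journey
```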
Is More Powerful Analytics the Answer to Privacy Concerns?
The answer is said to lie with powerful analytic tools like homomorphic encryption (HE), which allow firms to share and process personal data in ways that still protect privacy and security. And it’s clever stuff. HE allows the supplier to carry out its analysis without having to decrypt the data. What’s more, the result is then known only to the original provider of the data (in our case, the insurer). The data remains encrypted throughout, and the revealed insight stays with the originating party.
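To see the difference, here is a toy sketch of the idea, assuming nothing beyond standard Python. It implements a stripped-down Paillier scheme, which is only partially homomorphic (it supports addition) but is enough to show the principle; a real deployment would use a proper HE library such as Microsoft SEAL, with production-grade parameters rather than demo-sized primes like these.

```python
# A toy additively homomorphic scheme (stripped-down Paillier).
# Purely illustrative: tiny fixed primes, not production crypto.
import math
import random

def keygen(p=293, q=433):                   # demo-sized primes only
    n = p * q
    lam = math.lcm(p - 1, q - 1)            # Carmichael's function for n = p*q
    mu = pow(lam, -1, n)                    # modular inverse of lam mod n
    return n, (lam, mu, n)                  # public key, private key

def encrypt(n, m):
    r = random.randrange(1, n)              # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def add(n, c1, c2):
    """The supplier's step: combine ciphertexts WITHOUT decrypting anything."""
    return (c1 * c2) % (n * n)              # multiplying ciphertexts adds plaintexts

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n

n, priv = keygen()
# The insurer encrypts two figures and ships only the ciphertexts out.
c1, c2 = encrypt(n, 120), encrypt(n, 85)
c_sum = add(n, c1, c2)                      # the supplier never sees 120 or 85
print(decrypt(priv, c_sum))                 # only the insurer can recover: 205
```

Notice that the supplier’s `add` step never touches a key. That is the property on which the whole pitch rests.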
The end of privacy concerns? It feels like a big step in that direction. Yet is it so simple?
Homomorphic encryption may satisfy privacy concerns, but is that enough? What about other ethical concerns? Could tools like HE make those other ethical concerns more likely, more significant?
What powerful tools like HE create are opportunities to share data much more widely. And of course, such extended sharing allows all sorts of questions to be explored, about the person, about the risk being underwritten, about the claim being weighed up for settlement.
So what, you might ask. So long as the data remains encrypted, what’s the problem?
The problem is that just because tools like HE allow for more sharing of data, it doesn’t mean that all such sharing is ‘the right thing to do’. Just because you can share that data doesn’t mean that you should share it. Just because an insurer can explore disparate databases doesn’t make it right to do so. A couple of examples might help here.
Two Examples of Emerging Developments
Let’s say an insurer is presented with an opportunity to access data about their policyholders’ past infringements of the law. Finding this out wouldn’t break any rules relating to the rehabilitation of offenders, because the policyholder is not having to declare it. And let’s say that the insurer uses HE to pull insight from that data, which its counter-fraud algorithms then use to tailor premiums and claims services that bit more personally. After all, adverse selection tells them that it would be actuarially fair to do so. Yet the spirit of rehabilitation law says that offenders should be able to rehabilitate themselves after a period of time. And it is for the good of us all that they be able to do so. HE means that insurers could do this (I suspect some already are), but should they?
Another example. Let’s say that a genealogy firm allows an insurer, through its HE analytics, to access the genetic records it holds about its subscribers. This would allow the insurer to personalise the claims settlement it is negotiating with any of that firm’s subscribers, based on that person’s genetic record. Surprised? Then read the recent post on this blog about that development. Again, this wouldn’t be against the Code on Genetic Testing and Insurance, agreed between the Association of British Insurers and the UK Government. Yet is this a development that we want to see take off? HE makes it much more likely, but is it something that really should happen?
Technology Always Has Implications
It’s all very well seeing something like homomorphic encryption as a clever piece of digital technology, but like many other technologies, it comes wrapped up with all sorts of social, ethical and political implications. And insurers need to engage with those implications. Better to do so proactively, rather than defensively, after the reputational damage has been done.
The big message here is simple: don’t let neat technical fixes trick you into thinking that problems have gone away. As I emphasise with clients I’m helping to assess ethical risks, the scope you give to those problems will define what you see in return.
Take that earlier reference to a ‘data ethics landscape’. If you only see what’s ahead of you, you will fail to see the storm coming in from the side that will turn your planned route into a dangerous dead end. Privacy is part of every insurer’s data ethics landscape, but there are other parts too. Push down on one, and you may find others rearing up.
Am I suggesting that you question what your privacy lawyer tells you? Not at all. I’m saying that privacy lawyers see privacy really well, but see other things less well. A good navigator of that data ethics landscape needs multiple skills: privacy is just one of them.
Some Points to Remember
To conclude. When it comes to clever analytical tools like homomorphic encryption, insurers should remember…
- …that privacy impact assessments answer one question, not all of them;
- …to conduct wider, ethical impact assessments as part of their due diligence process;
- …that a genuine and properly used customer voice can help open up those other ethical issues;
- …that the regulator could well be monitoring those data partnerships and expecting the executives named on SMCR responsibility maps to be asking the right questions.