16 September 2013
Walking the tightrope
The health sector around the world is in a ferment of new ideas about health information. Innovations are everywhere, from shared care records to remote body monitors to health and fitness apps. Innovation can be viewed as a tightrope between the past and future. Enticing, yet treacherous.
And that’s why we need information privacy guidance. Information privacy gives people some control over their information in the face of the technological developments that are lessening that control. A sort of pole to keep us steady on the tightrope.
Health information is deeply personal, and deeply important, to all of us. And clinical confidentiality, which allows us to disclose that information to health professionals entrusted with our care, is equally important. The strongest asset that anyone who works in health information technology can have is not their servers or network infrastructure, it’s the trust of those they provide care for.
So creating conditions of trust for patients, practitioners and organisations should be an absolutely central goal for everyone working in health.
As Privacy Commissioner I am often viewed as an advocate for balance. And the most important aspect of a balancing act is not falling off. But what we’ve seen in the last year is a lot of very high profile tumbles: ACC, MSD, EQC.
It’s reasonable to ask why we haven’t had a similar high profile breach from a health agency, and there are two possible responses I might give. First is that it hasn’t happened yet but it’s going to, because it’s inevitable; Murphy’s Law, if you will.
Second is that it has happened, but the outrage has been relatively muted. The health sector has an enormous amount of goodwill in the bank. I suspect that means that when a doctor gets caught snooping on a famous sportsman’s file or sending round photos of a patient with an eel in a sensitive place, even though the individual clinician will be disciplined, the health system as a whole gets off lightly.
But there might well be a big privacy breach in the health sector around the corner. And if it’s high-profile enough, the innovation tightrope might be an uncomfortable place to be.
Because while polls I have conducted show the health sector has a sky-high approval rating, goodwill takes a long time to earn back once you’ve lost it.
A senior doctor at Johns Hopkins in the US calculated that, in 1970, the average patient had two and a half FTE clinical staff involved in their care. By the end of the 1990s, it was more than fifteen. And it will be even larger today – medicine isn’t getting simpler.
Shared care like that needs shared information. That means finding new and more efficient ways to get the right information to the right person at the right time. Making the best use of that information is a pressing need, because information sharing is vital to health care. Doctors, hospitals, laboratories and pharmacies all need to share information to be able to provide care to patients.
However, the risk is that the galloping advances in information sharing that new technology offers will leave patient trust behind.
For instance, there are some major sensitivities around the boundaries between health and social services information. Particularly where it relates to children. Drugs, alcohol and gambling can all raise addiction issues. Low income is a significant indicator for some health problems. And domestic violence has been approached as a health issue to be tackled through screening.
Well-intentioned efforts to address real social harms may risk damaging the vital trust that people have in their medical professionals. Patients who go to their doctor shouldn’t feel they’re going to the local representative of Big Brother.
But I think it’s possible to strike an appropriate, careful balance between disclosure and confidentiality. And if you do this, then privacy can be a roadmap, not a roadblock.
Health professionals in New Zealand are doing a good job here – our most recent survey put the health sector at the absolute peak of people’s trust, with a sky-high 94% approval rate.
Electronic health information technology doesn’t change that, but it raises some important concerns. For instance, when you have a huge store of electronic data, it means that things can go wrong faster and with more damaging effect.
As we have seen with the public concern over the US government tracking metadata of phone calls, big data can be profoundly intrusive. If you know everywhere I’ve been, and everyone I’ve spoken to, you know an awful lot about me even if you don’t know exactly what I was saying.
Another problem is the function creep of using large datasets for new purposes. Collections of data need to have a purpose attached to them, and breaching that purpose is essentially a breach of trust with the people whose information it is.
So whether putting together a new system, or repurposing an old one, you should consider those purposes carefully.
Of course new information systems can be valuable. Researchers can use them to identify new health risks. Bureaucrats can use them to help get the most health bang for scarce health bucks. And, last but definitely not least, GPs can use the information to improve their clinical relationship with their patients and the care they provide.
But ultimately all this information comes from the patient, and the GP. Both those people have the ability to turn that tap off. And they will, if the vital clinical relationship is threatened.
Overseas experiences in the US, the UK and Australia show that when clinicians get uncomfortable with how their patients’ information is handled, they react by withdrawing that information. And while I support the need for sophisticated health information technology, it’s people like GPs who will continue to be the first line of defence for their patients’ privacy.
I once read that the wise learn from the mistakes of others, a fool only learns from their own and most people never learn at all.
So as you walk the innovation tightrope, I suggest that you be wise, put one foot in front of the other, and try not to look down.