Technology is revolutionising healthcare at all stages of the patient journey. From remote GP appointments and wristbands that count our steps, to 3D printers producing human cells, and robots carrying out surgery, health-tech startups are using artificial intelligence (AI), machine learning and wearables to create more personalised, accessible care. And at the heart of it all is data.
This information is paramount to the evolution of healthcare. But with big data comes great responsibility. Here’s why privacy needs to be integral to health-tech innovation.
Health-tech businesses rely on building and sustaining trust with their users. Individuals need to feel comfortable sharing their most personal data with a commercial entity, and many patients are suspicious of such an exchange. In a global Accenture survey of more than 7,800 people, 55% said they do not trust tech companies to keep digital health information secure. In 2019, information about millions of NHS patients was found to have been sold to pharmaceutical companies abroad, and plans to create a central database using anonymised GP patient records were delayed this year after a public outcry. However, there is also evidence of an appetite to embrace change: Accenture found 27% of respondents are willing to try virtual care from tech companies such as Google and Microsoft, 25% from Amazon, and 21% from medical startups. Transparency is crucial – patients want to focus on getting better, not constantly checking their privacy settings.
Health-tech entrepreneurs can accomplish amazing things, given access to the right data. But in health, more than in any other sector, the patient-business relationship is sacrosanct. Healthcare is by its very nature emotional, so there is no room for error. Get privacy right and you’ll create loyal customers who believe in your business. Unwittingly lose personal health data and you could traumatise a patient, open yourself up to litigation, or face a barrage of bad reviews on social media. Users and their best interests must be put first. The regulator has its eyes on this sector too: the Information Commissioner’s Office (ICO) found that the Royal Free London NHS Foundation Trust breached data protection law when, in 2015, it transferred the records of 1.6 million patients to Google’s DeepMind.
Medical information is reportedly among the most valuable on the black market, and there has been a boom in ransomware attacks on healthcare, with cyber criminals believing they are more likely to be paid because of the critical nature of the service. In 2020, for example, fitness wearables company Garmin reportedly paid $10 million to free its systems, and there have been a number of attacks on public health services across Europe. In 2017, the WannaCry attack hit more than a third of English NHS trusts, forcing 7,000 appointments to be cancelled. In Germany, the number of successful cyber attacks on health service providers operating critical infrastructure more than doubled in 2020 compared with the year before, while France reported 27 major cyber attacks against health institutions last year.
In the UK alone, the health-tech sector has attracted more than $7.7 billion from investors over the past five years, making it the second biggest category in the national tech sector. Technology giants such as Facebook, Apple and IBM are all eager to expand into healthcare. Amazon recently launched a wristband that tracks health, and Microsoft will reportedly pay $19.7 billion to purchase Nuance Communications, a pioneer in conversational AI for healthcare. The potential for multi-million-pound deals is huge, and privacy is an important strand in the due diligence process. Investors want to know that a company has the right procedures, training and culture in place to prevent a future fine from the regulator or reputational damage in the case of a security breach.
Health is a highly regulated sector. As well as data protection and privacy, there is strict guidance governing medical devices (including software), patient care and confidentiality, clinical trials, governance, advertising, public procurement and product liability. The EU’s Medical Device Regulation took effect on 26 May 2021, with the In Vitro Diagnostic Medical Device Regulation following in May 2022, and the UK recently adopted a new NHS code of conduct for AI systems, considered a potential precursor to wider regulation. The Privacy Compliance Hub provides a clear and easy-to-understand checklist that employees can follow and implement, negating the need to remember each step. With 90% of data breaches down to human error, it’s imperative your team has the tools it needs to meet regulatory demands.
Build a culture of continuous privacy compliance
At the Privacy Compliance Hub, we make compliance easy for everyone to understand, care about and commit to. We call it a culture of continuous privacy compliance. Our platform, created by two ex-Google lawyers, provides a structured programme to follow, giving you confidence you’re keeping your customers, investors and the regulators happy. Discover how it works here.