
February 14, 2020

Patient data debates - the not so good, the bad and the ugly

Commentary
Tessy Huss

For anyone in the business of digital health, it is difficult not to pay attention to the patient data debates fought out daily in the news. In recent times, the data collection and processing activities of commercial entities - and perhaps especially of those that handle medical or patient data - have come under far greater scrutiny. Yet this does not seem to be much of a deterrent, and companies keep getting caught up in ethically dubious practices.


In this week’s blog we chime in on the patient data debates - we review some notable data controversies, including Practice Fusion’s kickback scheme, and discuss their implications for the digital health industry.


A colourful spectrum

Not all data scandals are the same - some are deliberate, some absolutely not, some are more deceitful than others, some are larger and more consequential. Some are not scandals at all, but rather cautionary tales that remind us to question the motivations and ethics of those to whom we give our data. In recent years numerous stories have triggered data debates in the digital health arena; here are some of the most notable.



In late January of this year, it emerged that Practice Fusion had been taking ‘kickbacks’ from an opioid manufacturer to incentivise opioid prescriptions via alerts built into its EHR software. It further emerged that Practice Fusion allowed pharmaceutical companies to influence the rules and select the guidelines built into its clinical decision support (CDS) system. Practice Fusion must now pay $145 million to resolve the matter. Documents released by the US Department of Justice detail how the drug maker’s and Practice Fusion’s marketers conspired to design a pain CDS with blatant disregard for pain management guidelines and scientific evidence. Besides the obvious wrongdoing - the manipulation of the patient-doctor relationship - this case is also a powerful example of how a data collection and analysis exercise was misappropriated to compromise patient care, with potentially long-term ramifications.




In 2016, DeepMind (a subsidiary of Google’s parent company, Alphabet Inc.) announced its entrance into healthcare via a partnership with the Royal Free London NHS Foundation Trust aimed at speeding up the detection of acute kidney injury. The project was criticised for its “lack of clarity and openness” around the transfer and handling of identifiable patient information. DeepMind acquired medical information on some 1.6 million British patients without their explicit consent and without providing them with a notice of intent. Furthermore, the transfer and subsequent analysis of this information did not occur in a relationship of direct patient care, nor had the company obtained explicit research approval. Late last year, Google made similar headlines again, this time in relation to its Project Nightingale, which has led regulators to question Google’s compliance with privacy laws, while observers predict that the company will use emergent medical data for profiling and marketing.


In 2018 the consumer genetic testing company 23andMe announced a four-year strategic collaboration with GlaxoSmithKline (GSK) that would see both entities leverage genomic data to develop new medicines. The deal gave GSK access to a proprietary database replete with de-identified information on 5 million individuals. A predictable commercial move perhaps, but the deal sparked surprise and anger among customers who were reportedly unaware that their data was being used for secondary research purposes. Contrary to this belief, however, 23andMe only shares insights from consented data with its partners. A lawyer and bioethics researcher interviewed in Wired also points out that the company’s privacy notice is especially detailed and lays out the specific parameters of its data processing activities, and that the company operates under the oversight of an ethical review board. Unlike in academic research, however, where participants are guided through the consent process by a researcher, consumers must navigate this (very wordy) process themselves.

In July 2019, Surescripts, an electronic prescription management company, accused Amazon-owned PillPack of accessing patient prescription information via a third-party back door. PillPack gathers patients’ prescription histories on their behalf, rather than relying on erroneous manual entry or patient recall. Until then, PillPack had received this data from ReMy Health, which in turn sourced it directly from Surescripts. While analysts contend that the accusations may have been an effort to limit Amazon’s foray into the electronic prescription space, Surescripts alleged that ReMy Health used deceitful tactics to get the information to PillPack. To boot, a few months later it also emerged that ReMy Health had committed fraud by sharing health insurance and prescription price information with third-party drug marketing websites.




In February 2019, Facebook found itself embroiled in yet another data scandal. A Wall Street Journal investigation uncovered that Facebook had been receiving personal (including potentially sensitive) information from numerous apps, including the Flo Period & Ovulation Tracker. Some months later, in September, a separate study by Privacy International uncovered further data-sharing malpractice among menstruation and fertility tracking apps. The study found that the Maya (Plackal Tech) and MIA (Mobapp) apps, both of which have been downloaded by millions of people, shared data on intimate female health with Facebook before users had even consented to the companies’ privacy policies. This was not the first time Facebook had been criticised for how it acquired patient data, however. Facebook’s very first foray into healthcare dates back to 2018, when the company sought access to anonymised patient data from major US hospitals; that project was stopped early on and never made it past the planning phase. And yet, 18 months after the Cambridge Analytica and menstrual app scandals, Facebook is revisiting its health ambitions with a Preventive Health tool, which the company is developing in conjunction with the American Cancer Society, the American College of Cardiology, the American Heart Association and the Centers for Disease Control and Prevention.



Giving an inch but taking a mile

Even though the examples above are quite different from one another, what they have in common is that, at one point or another, these companies used or acquired patient data in an effort to further commodify and monetise it. They also highlight how pervasive and large-scale such practices are. The problem is that ulterior motives are often concealed and sold as health improvement initiatives (which some of them undoubtedly are). Perhaps even more problematic, however, is the deception involved in advancing commercial objectives. The nature of the deceit differs in every case - from asking consumers to agree to the T&Cs of ambiguous and opaque privacy policies, to processing patient data without explicit consent, to the downright criminal manipulation of data with potentially harmful consequences, as in the Practice Fusion example - but these revelations are especially detrimental to an industry that is highly dependent on consumer trust.

A survey conducted in 2018 revealed that of 4,000 people polled, only 11% were willing to share their health data with a tech company. Of those, 60% were happiest to share their data with Google, while only 49% and 40% felt comfortable sharing their data with Apple and Facebook respectively. The results are interesting, especially when interpreted in the context of modern participation in society, which is so contingent on sharing data with tech companies. They are also interesting because ultimately we should want to share our data, to reap the benefits of technological advancements and to contribute to altruistic research endeavours.

Seemingly, however, the mistrust is not confined to the public but is also felt within the industry. In late January of this year, Epic CEO Judy Faulkner wrote to hospitals and hospital systems urging them to oppose the US Department of Health and Human Services’ impending interoperability and information-blocking rule. The proposal, authored by the Office of the National Coordinator for Health IT (ONC), sets out rules to support the “seamless and secure access, exchange, and use of electronic health information”. The rules seek to empower patients by giving them ownership over their healthcare data and the freedom to share this information with third-party providers. Apparently out of concern for the confidentiality and privacy of patients, others within Epic have gone on the record to caution that the new rules will inevitably serve the commercial interests of venture capitalists and others keen to capitalise on patient data.


While laws and regulations have advanced by leaps and bounds in their effort to protect data subjects, it is important to point out that what is lawful is not necessarily ethical. Over the last few years the digital healthcare space has seen a proliferation of ethical principles, codes, guidelines and frameworks, and while this is undoubtedly a positive development, some observers caution that it may lead to what Luciano Floridi has coined ‘ethics shopping’ or ‘ethics bluewashing’. At the risk of sounding overly moralistic, the industry needs to start carving out a joint path to ethical profitability.



