Introduction
The advancement of artificial intelligence (AI) in healthcare information management is an unpredictable journey, both thrilling and daunting, akin to the wild west. Healthcare accounts for over 30% of the world’s data volume, a measure of the field’s enormous data footprint. While the exact data output from wearable devices remains uncertain, research shows that about one in three American adults use these devices to track their health. By skillfully harnessing and synthesizing this vast data with AI, we have the potential to revolutionize patient care. But will we succeed? This article examines whether data from wearable devices can genuinely lead to better care. Let’s explore the possibilities together.
AI in Healthcare
AI as used in this article “refers to the capacity of computers or other machines to exhibit or simulate intelligent behavior.”[1] AI encompasses machine learning (ML), which “is a family of statistical and mathematical modeling techniques that use a variety of approaches to automatically learn and improve the prediction of a target state, without explicit programming.”[2] Natural language processing (NLP) enables computers to understand and organize human speech.[3]
AI technology in healthcare is not new. For example, over sixty years ago development began on ELIZA, the first chatbot using natural language processing to imitate a human therapist.[4] In 1971, development began on INTERNIST, an algorithm-based system able to make internal medicine diagnoses.[5] In 1972, development began on MYCIN, a rule-based system able to diagnose and suggest therapies for infectious diseases.[6] In 1995, the FDA approved the PAPNET Testing System, an AI/ML-enabled medical device.[7] In 2022, ChatGPT caught the public’s attention.[8]
Even though AI has been around for a long time, it did not ignite dramatic change until recently. What changed? In 2017, the U.S. Department of Health and Human Services published a study by JASON (an independent group of elite scientists) exploring how AI might shape the future of healthcare delivery from a personal level to a system level.[9] JASON found that a confluence of three forces “primed our society to embrace new health centric approaches that may be enabled by advances in AI.”[10] First, there exists a general frustration with outdated medical systems. Second is the “ubiquity of networked smart devices in our society.”[11] Third is “acclimation to convenience and at-home services like those provided through Amazon and others.”[12]
Wearable Health Technology
A reflection of our health-centric culture lies in the explosion of wearable medical devices. Wearables are “seamlessly embedded portable computers … worn on the body.”[13] Wearable technology was first proposed in the 1960s.[14] Since then, wearable technology has become entwined in people’s daily lives via smart watches, smart bracelets, armbands, glasses, vests, and more.[15], [16] Wearables may motivate users to exercise, to limit calorie intake, to drink water, to monitor vitals, and to see healthcare providers when there are abnormal metrics. The Apple Watch Series 4, released in December 2018, combined the functions of an ECG and a watch for the first time, allowing the watch to display an ECG to monitor for occult atrial fibrillation.[17]
In addition, “portable medical or health electronic devices” have been developed for use in diagnosis and medical treatment.[18] Such devices “perceive, record, analyze, regulate, and intervene” in health-related areas and “can even be used to treat diseases with the support of various technologies for identification, sensing, connection, cloud services, and storage.”[19] Wearables can measure and assess the wearer’s gait, walking speed, posture, respiratory rate, blood oxygen, heart rate, blood pressure, energy expenditure, mood, and more. The goal is to provide “real-time, online, accurate and intelligent detection and analysis” of health data that can be used for “self-diagnosis and self-monitoring.”
Advancing technology allows wearables to be used in the management of chronic diseases, such as cardiovascular disease, pulmonary diseases (including chronic obstructive pulmonary disease (COPD) and bronchial asthma), diabetes, and hypertension. Wearables facilitate data collection and monitoring throughout the user’s day. For example, there is now a sports vest made from nanofibers coated with an electroconductive polymer to place the ECG electrodes in close contact with the human body.[20] The vest displays the ECG signal in real time, with monitoring data collected through an app and analyzed by physicians to monitor heart diseases.[21] There is even a wearable cardioverter-defibrillator for sudden cardiac arrest that can be used prior to an implantable cardioverter-defibrillator.[22]
With advances in virtual reality and remote technology, wearables have also been used in medical education, the formulation of preoperative surgery plans, intraoperative navigation, preoperative doctor-patient communication, and remote consultation. Collecting data throughout normal daily activities with a wearable may provide a richer data set than “a snapshot” reading at a clinic.[23]
Better-informed patients may be more proactive in their discussions with healthcare providers. This in turn could lead to improved decision-making discussions between a patient and healthcare provider and better patient adherence to any treatment plan.
Integration of AI and Wearable Tech
AI access to wearable tech data opens up large sets of personalized data. But is that data valid and reliable?
The validity and reliability of wearable technology have been studied.[24] These concepts are illustrated as follows: “A bathroom scale is a reliable measure of one’s weight, provided one stands still on the scale for several moments. Yet, one is likely to discard the measurement shown on the scale if one is startled by a spider during these moments.”[25] Trying to measure signals from a wearable device is similar to measuring one’s weight while dancing around on the scale. “The fidelity of the measurement will depend not only on the sensor’s accuracy, but also on the environmental conditions under which measurement was taken.”[26]
The term “fidelity” refers to validity and reliability.[27] Validity denotes measurement accuracy, which is usually determined in relation to a gold-standard measurement of the same variable.[28] Reliability refers to measurement precision, that is, the consistency of several measurements taken in the same conditions and/or with the same equipment.[29]
While validity of data from wearable sensors is generally thought to be good,[30] that is not necessarily true for reliability. Making things even more complex, reliability of a wearable sensor “is not fixed but varies across different contexts and circumstances.”[31]
In other words, reliability is difficult to guarantee. Because of that, the use of AI with data generated from wearable technology must account for the presence of unreliable data. Some wearable devices may not accurately record metrics, with the variability in performance attributable to device quality, user anatomy, and/or movement interference. For example, a loose-fitting watch may not accurately measure heart rate. Further, wearable devices often require calibration to match an individual’s specific measurements and habits. Users may not follow setup instructions closely, resulting in data that does not truly reflect the user’s health or activity levels. Those who worked on litigation concerning a diet drug in the 1990s may recall the calibration issues that surfaced with echocardiogram machines.
Reliability is also challenging because there is a lack of standardization among different manufacturers and models of wearables. This variability means that two different devices may provide two different readings for the same metric. Accordingly, before any data from a wearable device is incorporated into an AI system there must be a quality control process to assess validity and reliability of the data.
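The quality-control step described above can be sketched in code. The following is a minimal illustration, not a clinical standard: the helper function, the coefficient-of-variation threshold, and the error tolerance are all hypothetical choices made for the example, assuming a batch of heart-rate readings and an optional gold-standard reference (such as a clinical ECG value).

```python
from statistics import mean, stdev

def passes_quality_control(readings, reference=None,
                           max_cv=0.10, max_error=0.05):
    """Screen a batch of wearable heart-rate readings (bpm) before
    accepting it into an AI dataset.

    Reliability: the coefficient of variation of repeated readings
    taken under the same conditions must stay below max_cv.
    Validity: if a gold-standard reference value is available,
    the mean reading must fall within max_error of it.
    """
    if len(readings) < 2:
        return False  # too few samples to assess consistency
    avg = mean(readings)
    cv = stdev(readings) / avg  # coefficient of variation
    if cv > max_cv:
        return False  # inconsistent readings fail the reliability screen
    if reference is not None and abs(avg - reference) / reference > max_error:
        return False  # too far from the gold standard: fails validity
    return True

# A steady series close to the ECG reference passes; an erratic one fails.
print(passes_quality_control([71, 72, 70, 73], reference=72))   # True
print(passes_quality_control([55, 90, 62, 110], reference=72))  # False
```

In practice, a real pipeline would also record why a batch was rejected and account for context (rest versus exercise), since, as noted above, reliability varies across circumstances.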
Challenges and Considerations
While AI trained on data from wearables may ultimately lead to improved patient care and reduced healthcare costs for patients, what are the risks to the healthcare provider in adopting AI tools?
Privacy issues. One factor to consider is whether using patient information to train an AI system violates the patient’s privacy or confidentiality rights. In Dinerstein v. Google, LLC,[32] the U.S. Court of Appeals for the Seventh Circuit reviewed a class action lawsuit brought by patients against a university hospital and a research partner regarding the use of anonymized electronic health records to create predictive health models. The case arose because the university hospital and its research partners sought to develop software capable of anticipating patients’ future healthcare needs. If successful, the software promised to reduce medical complications, eliminate unnecessary hospital stays, and, ultimately, improve patients’ healthcare outcomes.
The first step in the research process involved the university hospital supplying its research partners with several years of anonymized patient medical records to “train” the software’s algorithms. A Data Use Agreement governed the transaction, and the agreement expressly prohibited the research company from attempting to identify any patient whose records were disclosed. Also, the university hospital required patients to sign paperwork detailing its confidentiality obligations. That paperwork notified patients that their permission was not required for the university hospital to use or share information in limited research-related circumstances, but all efforts would be made to protect patient privacy. The patient was also told he/she would not be entitled to compensation.
On behalf of a putative class, plaintiff claimed invasion of privacy and breach of medical confidentiality,[33] among other claims. Plaintiff also asserted that he had a property interest in his medical records. The court dismissed this argument, finding that medical records belonged to the medical provider.[34]
The court found that plaintiff lacked standing to raise any claim. “To sue in federal court, a plaintiff must plausibly allege (and later prove) that he has suffered an injury in fact that is concrete and particularized, and actual or imminent (or some combination of the three).”[35] Plaintiff failed to do so.
Accordingly, if a healthcare provider seeks to harness the data from a patient’s wearables, the patient must sign a release of that information and expressly consent to the healthcare provider’s use of the data in AI systems and for research, without additional compensation. That release must comply with the requirements of federal and applicable state laws.
Accuracy issues. Data from a patient’s wearable must also be verified. While AI systems may be a powerful tool, their accuracy and usefulness in healthcare settings is directly tied to the breadth and quality of the underlying training data. Even in a healthcare setting, AI systems may “hallucinate” or provide incorrect information. “These challenges underscore the importance of ensuring that … AI technologies are fit for the purpose to which they are ultimately applied, which includes users understanding the caveats and limitations of each tool.”[36]
Disclosure of private health information. Another risk to healthcare providers arises from the websites and patient portals they maintain. Information collected from these websites and portals by third parties has led to novel litigation claims. In Brahm v. Hospital Sisters Health System,[37] a class action complaint was filed claiming that the hospital routinely disclosed patient identities and protected healthcare information from the hospital’s website through advertising technology referred to as “pixel tracking.” According to the complaint, the hospital allegedly installed tracking pixels provided by third parties, including Facebook and Google, to collect data about patient activity on its website and patient portal without patients’ consent.
Plaintiffs asserted claims for breach of contract, conversion, breach of confidentiality, invasion of privacy, and a host of other claims. The complaint alleged that the hospital’s written privacy policies assured patients that it would not disclose personal health information without the patients’ written authorization; use or disclose sensitive personal information without patients’ express consent; or directly provide personal identifiable information to strategic partners for promotional purposes. While the court dismissed the conversion claim, it found the contract claims were sufficiently pled. The case remains pending.
Best practices. In December 2024, the Healthcare Information and Management Systems Society (HIMSS) and Medscape published a study on AI adoption in healthcare.[38] The study participants included medical doctors, nurses, other healthcare practitioners, and IT technology professionals. This group identified data security and privacy risks as the most significant challenge with AI.
Because AI systems access large healthcare datasets, which include confidential and sensitive patient information, there is a constant risk of data breaches and unauthorized access. Protecting patient privacy via encryption, access controls, and data minimization is crucial. Misuse can have devastating consequences. Additionally, patients must consent to the use of their personal information for medical AI. Because AI systems can be vulnerable to cyberattacks, proper security is crucial. Other factors to consider and be wary of include bias and fairness, ethical dilemmas, and overreliance.
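The data-minimization principle mentioned above can be illustrated with a short sketch: keep only the fields an AI model actually needs and drop direct identifiers before a record leaves the clinical system. The field names, the allow-list, and the helper function are hypothetical examples, not a HIPAA de-identification method.

```python
# Hypothetical allow-list of the only fields the AI model needs.
ALLOWED_FIELDS = {"age_range", "heart_rate_series", "step_count", "sleep_hours"}

def minimize_record(record: dict) -> dict:
    """Return a copy of the record containing only allow-listed fields,
    so direct identifiers never reach the AI training pipeline."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

raw = {
    "name": "Jane Doe",        # direct identifier: dropped
    "ssn": "000-00-0000",      # direct identifier: dropped
    "age_range": "40-49",
    "heart_rate_series": [72, 75, 71],
    "step_count": 8342,
    "sleep_hours": 6.5,
}
print(minimize_record(raw))
```

An allow-list (rather than a block-list) is the safer design choice here: a new identifier added to the record later is excluded by default instead of leaking through.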
Conclusion
Harnessing AI with valid and reliable data from wearable devices presents a wealth of opportunities to enhance modern healthcare. However, strategies to minimize bias, ensure privacy, and protect the security of confidential information must be integral to the overall AI strategy.
[1] Oxford English Dictionary, definition of “Artificial Intelligence”.
[2] Nakul Aggarwal et al., Advancing Artificial Intelligence in Health Settings Outside the Hospital and Clinic, National Academy of Medicine, at 4 (Nov. 30, 2020).
[3] Id. See also Tadiboina S, Benefits of Artificial Intelligence in Healthcare, Webology, Vol. 18, Nov. 5, 2021.
[4] Congressional Research Service, Artificial Intelligence (AI) in Health Care (Dec. 20, 2024), https://crsreports.congress.gov/product/pdf/R/R48319.
[5] Id.
[6] Id.
[7] Id.
[8] Id. (internal citations omitted).
[9] U.S. Dept. Health and Human Services, Artificial Intelligence for Health and Health Care (Dec. 2017).
[10] Id.
[11] Id.
[12] Id.
[13] Dehghani M, Kim K, Dangelico R. Will smartwatches last? factors contributing to intention to keep using smart wearable technology. Telematics Informatics. 2018 May;35(2):480–90. doi: 10.1016/j.tele.2018.01.007.
[14] Lin Lu, et al., Wearable Health Devices in Health Care: Narrative Systematic Review, JMIR Mhealth Uhealth, 2020 Nov 9; 8(11), doi: 10.2196/18907.
[15] Id.
[16] Kang HS, Exworthy M. Wearing the Future-Wearables to Empower Users to Take Greater Responsibility for Their Health and Care: Scoping Review. JMIR Mhealth Uhealth. 2022 Jul 13;10(7):e35684. doi: 10.2196/35684. PMID: 35830222; PMCID: PMC9330198.
[17] Ip JE. Wearable devices for cardiac rhythm diagnosis and management. JAMA. 2019 Jan 29;321(4):337–338. doi: 10.1001/jama.2018.20437. https://jamanetwork.com/journals/jama/fullarticle/2721089.
[18] Lu et al., supra, at n. 14.
[19] Id.
[20] Tsukada YT, Tokita M, Murata H, Hirasawa Y, Yodogawa K, Iwasaki Y, Asai K, Shimizu W, Kasai N, Nakashima H, Tsukada S. Validation of wearable textile electrodes for ECG monitoring. Heart Vessels. 2019 Jul;34(7):1203–1211. doi: 10.1007/s00380-019-01347-8. http://europepmc.org/abstract/MED/30680493.
[21] Id.
[22] Kaspar G, Sanam K, Gholkar G, Bianco NR, Szymkiewicz S, Shah D. Long-term use of the wearable cardioverter defibrillator in patients with explanted ICD. Int J Cardiol. 2018 Dec 01;272:179–184. doi: 10.1016/j.ijcard.2018.08.017. https://www.sciencedirect.com/science/article/pii/S0167527318329541?via%3Dihub.
[23] Kang, et al., supra, at n.16.
[24] Dudarev V, Barral O, Zhang C, Davis G, Enns JT. On the Reliability of Wearable Technology: A Tutorial on Measuring Heart Rate and Heart Rate Variability in the Wild. Sensors (Basel). 2023 Jun 24;23(13):5863. doi: 10.3390/s23135863. PMID: 37447713; PMCID: PMC10346338.
[25] Id.
[26] Id.
[27] Id.
[28] Id.
[29] Id.
[30] Barrios L., Oldrati P., Santini S., Lutterotti A. Evaluating the Accuracy of Heart Rate Sensors Based on Photoplethysmography for in-the-Wild Analysis; Proceedings of the 13th EAI International Conference on Pervasive Computing Technologies for Healthcare; Trento, Italy. 20–23 May 2019; pp. 251–261; Hernando D., Roca S., Sancho J., Alesanco Á., Bailón R. Validation of the Apple Watch for Heart Rate Variability Measurements during Relax and Mental Stress in Healthy Subjects. Sensors. 2018;18:2619. doi: 10.3390/s18082619; Kinnunen H., Rantanen A., Kenttä T., Koskimäki H. Feasible Assessment of Recovery and Cardiovascular Health: Accuracy of Nocturnal HR and HRV Assessed via Ring PPG in Comparison to Medical Grade ECG. Physiol. Meas. 2020;41:04NT01. doi: 10.1088/1361-6579/ab840a; Menghini L., Gianfranchi E., Cellini N., Patron E., Tagliabue M., Sarlo M. Stressing the Accuracy: Wrist-worn Wearable Sensor Validation over Different Conditions. Psychophysiology. 2019;56:e13441. doi: 10.1111/psyp.13441; Steinberg B.A., Yuceege M., Mutlu M., Korkmaz M.H., van Mourik R.A., Dur O., Chelu M., Marrouche N. Utility of a Wristband Device as a Portable Screening Tool for Obstructive Sleep Apnea. Circulation. 2017;136:A19059; Dur O., Rhoades C., Ng M.S., Elsayed R., van Mourik R., Majmudar M.D. Design Rationale and Performance Evaluation of the Wavelet Health Wristband: Benchtop Validation of a Wrist-Worn Physiological Signal Recorder. JMIR Mhealth Uhealth. 2018;6:e11040. doi: 10.2196/11040.
[31] Dudarev, et al., supra, at n. 24.
[32] 73 F.4th 502 (7th Cir. 2023).
[33] Illinois does not recognize a cause of action for breach of medical confidentiality, while other states do. See, e.g., Byrne v. Avery Center for Obstetrics and Gynecology, P.C., 175 A.3d 1, 7-17 (Conn. 2018) (as a matter of first impression, unauthorized disclosure of confidential information obtained in the course of the physician-patient relationship gave rise to a cause of action sounding in tort); Biddle v. Warren General Hospital, 86 Ohio St.3d 395, 401, 715 N.E.2d 518 (Ohio 1999) (“[w]e hold that in Ohio, an independent tort exists for the unauthorized, unprivileged disclosure to a third party of nonpublic medical information that a physician or hospital has learned within a physician-patient relationship”); McCormick v. England, 494 S.E.2d 431 (S.C. Ct. App. 1997) (“we hold South Carolina should recognize the common law tort of breach of a physician’s duty of confidentiality”); Alberts v. Devine, 479 N.E.2d 113 (Mass. 1985) (“a patient can recover damages if the physician violates the duty of confidentiality that plays such a vital role in the physician-patient relationship”), cert. denied sub nom, Carroll v. Alberts, 474 U.S. 1013 (1985).
[34] Dinerstein, 73 F.4th at 518 (citing Young v. Murphy, 90 F.3d 1225, 1236 (7th Cir. 1996)).
[35] Id. at 508.
[36] Congressional Research Service, Artificial Intelligence (AI) in Health Care (Dec. 20, 2024), https://crsreports.congress.gov/product/pdf/R/R48319.
[37] Brahm v. Hospital Sisters Health System, 23-cv-444, slip op., 2024 WL 3226135 (W.D. Wis. June 28, 2024).
[38] HIMSS and Medscape, AI Adoption in Healthcare Report 2024 (Dec. 2024).
Finis