
Wearable Health Devices: The Exchange of Privacy for Wellness

  • Writer: Society of Bioethics and Medicine
  • Dec 8, 2025
  • 3 min read

Writer: Manaal Khawer

Editor: Kimberly Arinton


Since the COVID-19 pandemic, interest in wellness and health has risen sharply. Contributing factors may include the emergence of fitness influencers and expensive consumerist lifestyle brands (Alo, boutique Pilates, etc.). Beyond these passing trends, technology has put supposed wellness improvement at our fingertips, making it ever more accessible. From Apple Watches tracking our daily steps to Oura rings analyzing sleep patterns, these devices generate personalized health insights while selling the promise of enhanced well-being. The collection of such intimate information raises the question: how is this data being used, and who owns it?


A survey published in 2018 found that more than 80% of users were unaware of how their biometric data was being collected or shared. Consumers tend to underestimate the reach of data collection, assuming that devices like smartwatches cannot store sensitive data because they lack keyboards, and believing they have nothing to hide in the daily activity data these devices collect [1]. In reality, these hyper-specific sensors can detect highly personal details such as emotional states, daily routines, and even stress levels. While many companies do disclose that health data can be shared with third parties, including insurance providers and advertisers, this information is often buried in Terms and Conditions and Privacy Policies that the average consumer does not read. For example, Fitbit’s Privacy Policy has stated that even anonymized data can be shared for research purposes [2]. Yet studies have shown that anonymized health datasets can contain enough information to re-identify individuals with alarming accuracy [3]. This raises ethical concerns about informed consent, as consumers are not fully aware of the extent to which their data is being shared.
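Why can “anonymized” data still identify someone? The classic mechanism is a linkage attack: stripped of names, records often retain quasi-identifiers (ZIP code, birth year, sex) that can be joined against a public directory. The sketch below is a deliberately simplified toy with entirely fabricated names and numbers; it mirrors the re-identification studies cited in [3] only in spirit, not in method.

```python
# Toy linkage attack: "anonymized" wearable records still carry
# quasi-identifiers that can be matched against a public record.
# All names and values below are fabricated for illustration.

anonymized_health = [
    {"zip": "10001", "birth_year": 1985, "sex": "F", "resting_hr": 54},
    {"zip": "10001", "birth_year": 1990, "sex": "M", "resting_hr": 71},
    {"zip": "10002", "birth_year": 1985, "sex": "F", "resting_hr": 88},
]

public_directory = [
    {"name": "A. Smith", "zip": "10002", "birth_year": 1985, "sex": "F"},
]

def reidentify(health_rows, directory):
    """Return (name, health record) pairs wherever a person's
    quasi-identifiers match exactly one 'anonymous' record --
    a unique match defeats the anonymization."""
    hits = []
    for person in directory:
        key = (person["zip"], person["birth_year"], person["sex"])
        matches = [r for r in health_rows
                   if (r["zip"], r["birth_year"], r["sex"]) == key]
        if len(matches) == 1:
            hits.append((person["name"], matches[0]))
    return hits
```

With only three quasi-identifier fields, the third “anonymous” record is unique and links straight back to a name, along with the sensitive heart-rate value attached to it.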


The convenience of these devices also brings risks of micromanagement, self-surveillance, and self-image issues. Although Oura Rings and Apple Watches promote mindfulness and self-care, they set daily thresholds for users to attain, which can become unrealistic and harmful. A 2022 article in The Lancet Digital Health warns that health-tracking devices may foster a culture of the “quantified self,” in which people’s worth and health are reduced to metrics such as steps, hours of sleep, or calories burned [4]. This blurs the line between bodily empowerment and pressure, leaving people conditioned by, and confined to, mere numbers.


Equity is another difficult issue. The algorithms in these devices have been shown to be less accurate across certain skin tones and body compositions [5]. For example, photoplethysmography (PPG), the light-based sensing method used in heart rate tracking, is less reliable on darker skin. Because melanin scatters and absorbs light, optical sensors do not return the same signal quality across skin tones, which can lead to misreadings or inaccurate health alerts. This reflects a broader pattern in medical technology, where biases in data and design produce unequal outcomes and may harm those who rely most heavily on the numbers.
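The failure mode described above can be made concrete with a toy model. In the sketch below, a PPG trace is a cardiac sinusoid plus sensor noise; the amplitude parameter is a stand-in for optical signal strength (weaker return when more light is absorbed before reaching the sensor), and the beat detector is a naive threshold-crossing counter, nothing like any vendor’s proprietary algorithm. The point is only that when signal amplitude drops while the noise floor stays fixed, a simple detector starts missing beats entirely.

```python
import math
import random

def simulate_ppg(amplitude, seconds=10, fs=50, hr_hz=1.2, noise=0.05, seed=0):
    """Toy PPG trace: a cardiac sinusoid of given amplitude plus Gaussian
    sensor noise. Lower amplitude stands in for a weaker optical return
    (e.g. more light absorbed before reaching the photodetector)."""
    rng = random.Random(seed)
    n = int(seconds * fs)
    return [amplitude * math.sin(2 * math.pi * hr_hz * t / fs)
            + rng.gauss(0, noise) for t in range(n)]

def estimate_bpm(signal, fs=50, threshold=0.5, refractory=0.3):
    """Naive beat detector: count upward crossings of a fixed threshold,
    ignoring re-crossings inside a refractory window. Real firmware is
    far more sophisticated; this only illustrates the failure mode."""
    gap = int(refractory * fs)
    beats, last = 0, -gap
    for i in range(1, len(signal)):
        if signal[i] >= threshold > signal[i - 1] and i - last >= gap:
            beats += 1
            last = i
    return beats * 60 / (len(signal) / fs)
```

At full amplitude the detector recovers roughly the simulated 72 bpm; at one-fifth the amplitude the waveform never clears the fixed threshold and the estimate collapses toward zero, the kind of dropout that produces missed or inaccurate health alerts.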


Health-tracking devices should be integrated into society ethically, supported by stronger data protection laws, algorithmic transparency, and accessibility measures. Governments and health institutions must establish clearer regulations on how companies collect and use data. Consumers, in turn, should be well informed about their data rights and what they are consenting to. These devices promise a future where we can monitor and improve our well-being in real time. However, we must balance innovation with privacy to avoid being reduced to mere sources of harvested data. These devices are meant to make us healthier, but do they compromise our freedom?


References

  1. Datta, P., Namin, A. S., & Chatterjee, M. “A Survey of Privacy Concerns in Wearable Devices.” IEEE International Conference on Big Data, 2018.

  2. Fitbit. “Privacy Policy.” Fitbit Legal, 2020.

  3. Rocher, L., Hendrickx, J. M., & de Montjoye, Y.-A. “Estimating the Success of Re-identifications in Incomplete Datasets Using Generative Models.” Nature Communications, 2019.

  4. Lupton, D. “Data Selves: More-than-Human Perspectives.” The Lancet Digital Health, 2022.

  5. Bent, B., et al. “Investigating Sources of Inaccuracy in Wearable Optical Heart Rate Sensors.” NPJ Digital Medicine, 2020.

 
 
 


Copyright © 2024 Society of Bioethics and Medicine. All rights reserved.