
Wearable Health Tech, Data Privacy, and Surveillance Capitalism

C. D. Morgan & I. A. Suleiman — Digital Health & Society. DOI: 10.9901/dhs.2026.0458
Abstract

Wearable devices collect biometric data that are increasingly monetized, raising privacy and equity concerns for marginalized users. This paper maps data flows from consumer wearables to insurers, employers, and analytics vendors through privacy policy audits, vendor documentation, and interviews with users and privacy experts. We assess re‑identification risks, secondary uses for underwriting and employment decisions, and differential impacts on populations with limited bargaining power. The analysis highlights gaps in consent, transparency, and regulatory protections and proposes policy options—limits on secondary use, stronger data minimization, and user control mechanisms—to protect privacy and prevent surveillance harms that disproportionately affect vulnerable groups.

Introduction

Wearable health technologies promise personal insights but also create data flows that can be repurposed for underwriting, employment screening, and targeted marketing. These secondary uses raise privacy and equity concerns, particularly for marginalized populations with limited bargaining power.

Methods

We audited privacy policies of major wearable vendors, traced data flows through vendor documentation, and interviewed 28 users and 12 privacy experts. We assessed re‑identification risk and potential secondary uses and evaluated regulatory gaps.
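The paper does not specify how re-identification risk was quantified; one common approach is k-anonymity, where risk is high when few records share the same combination of quasi-identifiers. The sketch below is purely illustrative (the field names `zip`, `age_band`, and `resting_hr` are hypothetical wearable-export attributes, not drawn from the study's data):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity of a dataset: the size of the smallest
    group of records sharing identical quasi-identifier values.
    A small k means those records are easier to re-identify."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

# Toy records mimicking a wearable data export (hypothetical fields).
records = [
    {"zip": "02139", "age_band": "30-39", "resting_hr": 58},
    {"zip": "02139", "age_band": "30-39", "resting_hr": 61},
    {"zip": "94105", "age_band": "50-59", "resting_hr": 72},
]

# One record is unique on (zip, age_band), so k = 1: that individual
# could be singled out by anyone holding those two attributes.
print(k_anonymity(records, ["zip", "age_band"]))  # → 1
```

Under this framing, "nontrivial" re-identification risk corresponds to low k for the quasi-identifier combinations a third party plausibly holds.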

Results

Many vendors permit broad secondary uses and data sharing with third parties. Re‑identification risks remain nontrivial, and employers and insurers are increasingly interested in biometric data for monitoring and underwriting. Users expressed concern about opaque data practices and limited control over their data.

Discussion

Policy options include limits on secondary use for underwriting and employment decisions, stronger data minimization and deletion requirements, and user control mechanisms. Protections should prioritize equity and prevent surveillance harms to vulnerable groups.
