AI Health Coaches: Empowering Fitness with Data, but at What Cost to Privacy?

New Wave of AI Health Coaches Raises Concerns about Data Privacy

Fitbit made headlines recently by unveiling Fitbit Labs, its AI health coach set to debut in 2024. The service aims to use artificial intelligence to turn the wealth of data collected by Fitbit devices into deeper insights for users.

One key feature of Fitbit Labs is a chatbot inside the Fitbit app. Users will be able to ask questions like, “What can you tell me about my last run?” The chatbot will respond with details such as pace, distance, and time, along with deeper analysis, such as possible reasons a run went worse than similar previous ones.

The extensive data gathered by leading wearable devices, including Fitbits, Apple Watches, WHOOP bands, Garmins, Pixel Watches, and others, covers a wide range of health metrics. These devices monitor stress levels, heart rate, body mass, sleep quality, exercise habits, and more. In many cases, these devices collect more data about you than your personal trainer or doctor.

To access this data, users typically dive into their device’s companion app to find detailed graphs and activity breakdowns, often presented as numerical scores. The introduction of a conversational AI, like the Fitbit chatbot, that can readily provide insights on your health data is a logical step forward.

Apple has also ventured into this territory with the Apple Watch Series 9 and Apple Watch Ultra 2, enabling users to ask Siri about their health data. While not a full coaching service, Siri can handle “fitness-related queries” with on-device processing, eliminating the need to send user data to external servers.

WHOOP, on the other hand, introduced WHOOP Coach, a service designed to be a personal trainer in your pocket. It combines OpenAI’s GPT-4 with WHOOP’s own algorithms to answer questions about sleep, fitness, and more, while emphasizing user privacy.
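
To make the pattern concrete, here is a minimal sketch of how a coaching chatbot could combine structured wearable metrics with a hosted large language model. It assumes the standard OpenAI Python SDK and made-up metric names; it illustrates the general approach, not WHOOP’s or Fitbit’s actual implementation.

```python
# Hypothetical sketch: summarize wearable metrics into a prompt and send it to
# a hosted LLM. Field names, model choice, and prompt wording are assumptions,
# not any vendor's documented pipeline.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Example daily metrics as a companion app might expose them (made up).
metrics = {
    "sleep_hours": 6.4,
    "resting_heart_rate_bpm": 58,
    "last_run_distance_km": 5.2,
    "last_run_pace_min_per_km": 5.45,
}

prompt = (
    "You are acting as a fitness coach. Using these metrics, explain why "
    f"today's run may have felt harder than my previous runs:\n{metrics}"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

In a real service the metrics would come from the device’s own data pipeline and the prompt would be far more tightly constrained, but the core idea is the same: personal health data is summarized into text and handed to a model, which is exactly why the privacy questions below matter.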

While these AI health coaches seem beneficial, they raise questions about data privacy. The chatbots generate answers from models trained on vast amounts of data, and there is a real possibility that user health data ends up feeding those models. Many companies stress anonymization and privacy protection, but the potential for misuse, or for access by governments and police, remains a concern.
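
Companies often describe safeguards such as stripping identifiers before health data is processed. As a rough, hypothetical sketch of what one such step might look like (field names and the salted-hash scheme are assumptions, not any vendor’s documented practice):

```python
# Illustrative pseudonymization step: drop direct identifiers and replace the
# user ID with a salted hash before the record leaves the device.
import hashlib

def pseudonymize(record: dict, salt: str) -> dict:
    """Return a copy of the record without name/email and with a hashed user ID."""
    cleaned = {k: v for k, v in record.items() if k not in {"name", "email", "user_id"}}
    cleaned["user_id"] = hashlib.sha256((salt + record["user_id"]).encode()).hexdigest()
    return cleaned

record = {
    "user_id": "user-123",
    "name": "Jane Doe",
    "email": "jane@example.com",
    "resting_heart_rate_bpm": 58,
    "sleep_hours": 6.4,
}

print(pseudonymize(record, salt="per-deployment-secret"))
```

Pseudonymization of this kind is weaker than true anonymization: detailed health metrics can still make an individual re-identifiable, which is part of why these concerns persist.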

As AI health coach services continue to evolve, users should understand how their data is handled: where it is stored, whether it is shared with third parties, and whether there are guarantees against its use for advertising. These services can offer valuable insight into health and fitness, but safeguarding privacy is just as vital in this era of data-driven technology.