Privacy-First Fitness: Why Your Health Data Should Stay on Your Device

When you install a fitness app, you're handing over some of the most intimate data about yourself: how much you move, how well you sleep, your heart rate patterns, your weight history, where you walk and run. This isn't browsing history — it's a detailed map of your physical health.
Most people don't think twice about granting these permissions. But what happens to that data after you tap "Allow" should concern you.
What Most Fitness Apps Do With Your Data
The business model of most free fitness apps is straightforward: your data is the product. A 2023 study by the Mozilla Foundation reviewed 25 popular fitness and health apps and found that 80% of them shared user data with third parties, including advertisers and data brokers. The same patterns show up again and again:
- Activity data gets packaged and sold to insurance companies who use it to adjust premiums
- Location data from runs and walks gets sold to data brokers who build movement profiles
- Health metrics get shared with advertising networks for targeted health-related ads
- Even "anonymized" data can often be re-identified when combined with other datasets
This isn't hypothetical. In 2018, the fitness app Strava accidentally revealed the locations of secret military bases through its global heatmap feature. In 2021, Flo Health settled with the FTC after sharing users' pregnancy and fertility data with Facebook and Google despite promising not to.
Why Health Data Privacy Matters More Than You Think
Health data is uniquely sensitive because it can't be changed. You can get a new credit card number after a breach, but you can't change your health history. Once that data is out there, it's permanent.
The risks are real and growing:
- Insurance discrimination — health data can be used to deny coverage or increase premiums
- Employment decisions — some employers use wellness program data in ways employees don't expect
- Data breaches — health records reportedly sell for 10-40x more than credit card numbers on the black market
- Behavioral profiling — aggregated health data creates detailed profiles that follow you across services
Apple Health: The Privacy Standard
Apple built HealthKit with a fundamentally different approach. Your health data is encrypted on your device, and when synced to your personal iCloud account with two-factor authentication enabled, it's end-to-end encrypted — Apple can't read it. Apps that access HealthKit must request specific permissions, and you control exactly what each app can see. No data leaves your device unless you explicitly allow it.
“Your health data is encrypted and only accessible with your passcode, Touch ID, or Face ID. When your phone is locked, all health data in the Health app is encrypted.”
— Apple HealthKit Documentation
This is the foundation that privacy-first fitness apps should build on — not replace.
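To make the permission model concrete, here is a minimal Swift sketch of how any app asks HealthKit for read access. The calls are standard HealthKit APIs; the specific data types requested are illustrative, and a real app would request only what it displays:

```swift
import HealthKit

let store = HKHealthStore()

// Each data type must be requested explicitly; the user grants or
// denies read access per type in the permission sheet.
let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.quantityType(forIdentifier: .heartRate)!
]

store.requestAuthorization(toShare: nil, read: readTypes) { success, error in
    // `success` only means the request completed. By design, an app
    // cannot tell which read permissions were denied — denied types
    // simply return no data, so apps can't punish you for refusing.
    if let error = error {
        print("Authorization request failed: \(error.localizedDescription)")
    }
}
```

Note the privacy detail in the completion handler: HealthKit deliberately hides read-permission denials from the app, so saying "no" is indistinguishable from having no data.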
How We Built MySteps and Rings+ Without Collecting Your Data
When we set out to build MySteps and Rings+, we made a decision that shaped every technical choice: we would never collect, store, or transmit user health data. Not anonymized data. Not aggregated data. Nothing.
Here's what that means in practice:
- No accounts required — you never give us your email, name, or any personal information
- No servers — there's no backend infrastructure collecting or processing your data
- No analytics — we don't track how you use the app, what features you tap, or when you open it
- No crash reporting — we don't even collect error data from your device
- On-device AI — the AI coach in Rings+ runs entirely on your iPhone using Apple's Foundation Models
- Optional iCloud sync — if you want to sync between devices, it goes to your private iCloud, not our servers
Both apps read from Apple Health to display your steps, activity rings, and metrics. That data stays exactly where Apple put it — on your device.
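By way of illustration, fetching today's step total is a purely local HealthKit query — the sketch below uses standard `HKStatisticsQuery` APIs and is a simplified version of what any on-device step counter does, not our exact implementation:

```swift
import HealthKit

// Sums today's step samples entirely on-device; nothing is transmitted.
func fetchTodaySteps(from store: HKHealthStore,
                     completion: @escaping (Double) -> Void) {
    guard let stepType = HKObjectType.quantityType(forIdentifier: .stepCount) else {
        completion(0)
        return
    }

    // Limit the query to samples recorded since midnight, local time.
    let startOfDay = Calendar.current.startOfDay(for: Date())
    let predicate = HKQuery.predicateForSamples(withStart: startOfDay,
                                                end: Date(),
                                                options: .strictStartDate)

    // .cumulativeSum merges overlapping samples from multiple sources
    // (iPhone and Apple Watch) without double-counting.
    let query = HKStatisticsQuery(quantityType: stepType,
                                  quantitySamplePredicate: predicate,
                                  options: .cumulativeSum) { _, stats, _ in
        let steps = stats?.sumQuantity()?.doubleValue(for: .count()) ?? 0
        completion(steps)
    }
    store.execute(query)
}
```

Everything here runs in the HealthKit process on the phone — there is no network call to make, which is the whole point.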
What to Look for in a Privacy-Respecting Fitness App
If privacy matters to you (and it should), here's what to check before installing any fitness app:
- Check the App Store privacy label — Apple requires apps to disclose what data they collect and whether it's linked to your identity
- Read the privacy policy — look for language about "sharing with partners" or "improving our services," which often signals data sharing
- Look for account requirements — if an app requires you to create an account just to track steps, ask why they need that information
- Check for on-device processing — features like AI coaching should run locally, not on remote servers
- Look for a clear business model — if the app is free and has no premium tier, your data is likely the revenue source
Your Steps, Your Data, Your Choice
Fitness tracking should help you move more and feel better — not feed a data economy you didn't sign up for. The technology exists to build powerful, beautiful fitness apps that keep everything on your device. We know because we built two of them.
Track your steps or close your rings — completely private, completely free.