I’m an average runner, in every sense of the word. I don’t run at incredible speeds, I always do the same loop, and I always use the same app to track my runs. I’m not a secret agent, and I don’t work in the criminal underworld, so I don’t think my runs are of interest to anyone except those who follow me and can see two fundamental facts: how repetitive I am and how slow I am. In short: I’m incredibly boring.
Perhaps this leads me to think of Strava (and other tracking apps) as harmless, neutral tools. And they have been, at least until they introduced social-network features that allowed users (for excellent reasons) to share routes, compare themselves with one another, and even meet up and organize training sessions together.
However, there are less noble uses for these apps, and an article from the Global Investigative Journalism Network discusses them in detailed and unsettling ways.
When Innocence Becomes Vulnerability
This isn’t science fiction; it’s something we’ve already seen happen. In 2018, we had already reported that Strava’s Global Heatmap had inadvertently revealed the location and layout of secret military bases. How? Simple: soldiers jogging inside military perimeters were tracking their activities and, in doing so, effectively drew detailed maps of facilities that were supposed to remain invisible. In desert areas where in theory no one should have been, running routes appeared, complete with the soldiers’ first and last names.
The more recent story concerns Russian military personnel in conflict zones. In 2023, a team of investigative journalists discovered that several military officials were using Strava with public settings active, allowing reporters to map identities, training bases, and daily routines.
But perhaps the most significant example for us common runners is that of Molly Seidel, Olympic marathon bronze medalist at Tokyo 2020. After years of total transparency with her 68,000 followers on Strava, she had to make her profile private. The reason? Unpleasant situations related to sharing her training routes. When you’re a public figure, even your morning run can become information too sensitive to share.
There’s an even more disturbing aspect: the GIJN article reveals how Strava data has been used to plan targeted attacks on specific runners. We’re not talking about random crimes, but actions meticulously planned based on the victims’ running habits. The routine that is comfort and regularity for us can become an opportunity for predation for someone else.
From Social to Surveilled
The paradox is evident: the more precisely an app tracks our performance, the more useful it becomes to people whose intentions have nothing to do with sport. Strava’s aforementioned “heatmap” feature, for example, shows the most frequented routes in an area. That’s a fantastic tool for discovering new trails, or for running popular routes in the hope of meeting people (something that can be nice if you’ve just moved to an unfamiliar city), but also for working out where and when a specific person habitually trains.
Imagine someone who, instead of physically stalking you, simply consults your public data on Strava and discovers that you run every Tuesday and Thursday at 7:00 AM, always in the same park. All valuable information obtained without even leaving home.
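To give a sense of how little effort that kind of inference takes, here is a minimal sketch in Python. The data is entirely made up (four hypothetical runs with public start times and start coordinates), but the logic — bucket activities by weekday, hour, and a coarse location grid — is all it takes to recover a routine like “Tuesdays and Thursdays at 7:00 AM, same park”:

```python
from collections import Counter
from datetime import datetime

# Hypothetical public activity data: (start time, start latitude, start longitude).
# In reality, this is exactly what a profile with public settings exposes.
activities = [
    (datetime(2025, 3, 4, 7, 2), 45.4642, 9.1900),   # a Tuesday
    (datetime(2025, 3, 6, 7, 1), 45.4641, 9.1902),   # a Thursday
    (datetime(2025, 3, 11, 7, 5), 45.4643, 9.1899),  # a Tuesday
    (datetime(2025, 3, 13, 7, 3), 45.4642, 9.1901),  # a Thursday
]

# Bucket each run by weekday, hour, and a coarse location grid
# (rounding to 3 decimal places groups points within roughly 100 m).
slots = Counter(
    (t.strftime("%A"), t.hour, round(lat, 3), round(lon, 3))
    for t, lat, lon in activities
)

day, hour, lat, lon = slots.most_common(1)[0][0]
print(f"Most likely to be found: {day}s around {hour}:00 near ({lat}, {lon})")
```

A dozen lines, no special tools, and the runner’s weekly pattern falls out of data they published themselves.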
Technology: Friend or Foe?
We don’t want to demonize technology or create unnecessary panic. The journalistic use of this data has, after all, brought important truths to light, exposing lies and contributing to transparency, and that is a good thing: not because it handed ill-intentioned individuals ideas on how to exploit the unintended consequences of certain features, but because it showed the rest of us how to effectively protect our privacy while still tracking our workouts.
The problem arises, in short, when the same tools are used with malicious, and sometimes even criminal, intentions.
The reality is that every run we take leaves a digital trace. Like a modern-day Tom Thumb, we scatter electronic breadcrumbs that remain imprinted on the servers of some tech company. And these breadcrumbs tell stories: where we go, when we go out, what our most intimate habits are.
Awareness as the First Step
The central question isn’t “should we stop using these apps?”, but rather “how can we use them consciously?”. Because technology, like a kitchen knife, can be a tool for creativity or harm, depending on how we use it.
Being connected in 2025 means existing even when we don’t realize it. Every step, every kilometer, every heartbeat becomes part of a digital narrative that may interest people we can’t even imagine. That’s no reason to get paranoid (“What will my Netflix habits say about me?”, “I knew joining that taxidermy group would give people a distorted idea of me!”), but it is a reason to pay attention.
Protecting Your Run
Fortunately, Strava is aware of these risks and has implemented several privacy features that we can activate. It’s important to know that they exist and to use them according to our needs. I, for example, have chosen to obscure the start and end points of my runs—a simple but effective precaution to avoid revealing where I live or work. I know very well that no one cares, but you never know.
You can make your activities private, create privacy zones around your home, limit who can see your routes, and even hide specific sections of your workouts. The app provides the tools, but the final decision on the degree of protection always rests with you.
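For the curious, the idea behind a privacy zone is simple enough to sketch in a few lines of Python. This is not Strava’s actual implementation, just an illustration of the principle with invented coordinates: drop every GPS point that falls within a chosen radius of a protected location, so the published track never shows your doorstep:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two coordinates."""
    r = 6371000.0  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def apply_privacy_zone(track, protected, radius_m=200):
    """Remove every GPS point within radius_m of the protected location."""
    return [
        (lat, lon) for lat, lon in track
        if haversine_m(lat, lon, protected[0], protected[1]) >= radius_m
    ]

# Hypothetical run: the first two points are near "home".
home = (45.4642, 9.1900)
track = [
    (45.4642, 9.1900),  # the doorstep
    (45.4650, 9.1905),  # roughly 100 m away, still inside the zone
    (45.4700, 9.1950),  # well outside the zone
    (45.4750, 9.2000),
]
print(apply_privacy_zone(track, home))  # only the two distant points remain
```

The published route then starts and ends somewhere along the way, rather than at your front door, which is precisely the precaution described above.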
Running Free, But Aware
Running should remain a moment of freedom, not turn into a source of anxiety. But being aware of the risks allows us to continue doing what we love, while protecting our privacy and security.
The next time you record your workout, take a moment to reflect: not out of fear, but out of awareness. Because in an increasingly connected world, your digital footprint can tell much more than you imagine. And it’s up to you to decide what story you want to share with the world.