Strava fitness tracking app reveals location of military bases

When Strava released its activity heatmap last year, it was quite proud of how many people had used the app to record their runs and rides. It boasted of having six times more data than before, over a billion activities in total, forming what it called the largest, richest, and most beautiful dataset of its kind. Unfortunately, that data is now landing the company in a security controversy involving some of the world's most secretive organizations: military forces. In tracking users' routes and locations, Strava may have inadvertently revealed the locations of army bases around the world, including some secret US installations.

To be fair, Strava isn't exactly at fault. Its technology is doing what it's supposed to do: helping users keep track of their activities, particularly exercise like running and jogging. Those records include not only data such as time and step counts but, on GPS-equipped devices, location and routes as well. When those users are military personnel stationed at bases, however, and are unaware of the implications, it becomes a rather worrying mess for governments.

Strava's heatmap, for example, shows markers in Syria and Russia for known US bases. And it isn't just the US either. Thousands of British military personnel have also contributed to "lighting up" areas such as RAF Mount Pleasant in the Falkland Islands, along with Lake Macphee and Gull Island Pond, to name a few.

Perhaps even more worrying than divulging locations is that these heatmaps can also reveal the bases' layouts. Although satellite mapping already shows some of those areas, the biking and running routes of Strava users also trace out roads and pathways: sensitive information that could be useful to those with less than innocent purposes.

On the other hand, Strava could have been more careful in releasing this information. The company presumably audited the maps to pick out highlights, so it could also have spent time making sure sensitive data wasn't included. And if that wasn't feasible for lack of human eyes, it might have been a good opportunity to bring in the machine learning technologies so hyped these days.

SOURCE: The Guardian