Project Polar: why we decided to publish our findings
How do you reveal grave risks that people need to know about, when doing so will expose data that endangers people’s lives?
Today we published our articles on Polar. In them, we reveal how easy it is to use the company’s fitness app to find the names and home addresses of intelligence agents, people who work at sites where nuclear weapons are stored, and military personnel on assignment. Not just in the Netherlands, but worldwide.
This information is vital to the public interest. Our investigation exposes an immense and existing risk, namely that Polar and other fitness apps can unwittingly reveal the names and addresses of people who absolutely should not be identified. This can endanger the lives of soldiers and their families, compromise military operations, and jeopardize national security. These people truly do have something to hide.
This is information anyone can find and abuse. It may already have happened.
We know: making this public can give bad people bad ideas
Throughout our investigation, we’ve been acutely aware of the responsibility we bear. Sharing what we’ve learned, and exactly how we learned it, can give people dangerous ideas. But that shouldn’t keep us from publishing our findings. It’s precisely the extent of the problem – there are dozens of similar apps, millions of users, and countless people and organizations at risk – that requires this information be revealed.
We hope that publishing this information will improve awareness about this kind of technology everywhere: among users, governments, and companies.
We hope to give users a wakeup call. To make them take a good, hard look at their privacy settings and how they use apps. Our investigation reveals that frighteningly few people are aware of the risks. Even intelligence and military personnel are uninformed. Can you, your family, and your employer afford for you to be so careless with your location data?
We hope to give governments and employers a wakeup call. Do they have any idea what security measures are being nullified by sloppy app behavior? Are existing protocols sufficiently monitored, existing rules sufficiently enforced?
Our investigation reveals that the answer is no. Despite the Strava incident earlier this year, when the American fitness app's global heat map inadvertently revealed the locations of military bases, we still managed to identify intelligence and military personnel through their use of Strava.
And we hope to give technology companies a wakeup call. The apps they make are fantastic, but they haven't put enough thought into the sensitivity of the data they collect, they overestimate their users' awareness of security, and it's hard to get them to make privacy a priority. Don't they realize that a purely technological or legal approach to their service isn't enough?
How we protected the collected data
At the start of our investigation, we conducted an extended threat analysis with our security specialists. Our major concern was the safety of the people we identified. Under no circumstances could the data we collected on them be allowed to fall into the wrong hands.
From the start, we handled the information with the greatest possible care. We limited the number of people with access to the data to an absolute minimum. We used standard encryption to store the data on high-security storage devices that were not connected to the Internet.
All our communication regarding the project ran through secure channels. At no time did we use the cloud to store or share sensitive information. During our investigation, we ramped up our systems monitoring for unauthorized access and vulnerabilities.
Last week, as the publication date approached, we began to share more information about the project with our editorial colleagues. No data on the identified individuals was shared with them. Before passing anything to our infographic designer, we hashed all identifying data. Unlike encryption, hashing is irreversible: the original names and addresses cannot be recovered from the hashed values.
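To illustrate the kind of pseudonymization described above: the article does not say which hashing method was used, but a common approach is a keyed hash such as HMAC-SHA-256 with a random salt. The sketch below (a hypothetical example, not our actual pipeline) shows how an identifying value can be replaced by a consistent but irreversible alias before it is shared.

```python
import hashlib
import hmac
import os

def pseudonymize(value: str, salt: bytes) -> str:
    """Return a hex digest that stands in for an identifying value.

    A keyed hash is one-way: the digest can be matched against other
    digests made with the same salt, but the original name or address
    cannot be recovered from it.
    """
    return hmac.new(salt, value.encode("utf-8"), hashlib.sha256).hexdigest()

# One random salt for the whole dataset keeps aliases consistent across
# records while blocking precomputed-dictionary (rainbow table) attacks.
salt = os.urandom(32)

alias_a = pseudonymize("Jane Doe, 12 Example Street", salt)
alias_b = pseudonymize("Jane Doe, 12 Example Street", salt)

assert alias_a == alias_b      # same input, same alias
assert "Jane" not in alias_a   # the digest reveals nothing readable
```

The same input always yields the same alias, so a designer can still see which records belong together without ever seeing a real name or address.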
We have not shared any identifying information with the other people and organizations with whom we’ve had contact, from the Dutch Ministry of Defense to journalists at other publications.
Last Tuesday, we permanently deleted all the data from our secured data storage devices.
How we shared our findings with others
It was important to inform all the affected organizations as quickly as possible, so they had enough time to take corrective action. This was a massive task, given how many organizations needed to respond: national defense departments around the world, intelligence agencies, Polar and other app makers, and other government divisions.
We informed the Dutch Ministry of Defense first, on Friday, June 22. The following Monday, we presented our findings in detail at the ministry.
We wanted to give the ministry plenty of time to ensure that affected employees – such as intelligence operatives and deployed soldiers – would not be placed in danger when we published.
We also asked the ministry to help us contact its counterpart in other countries that had surfaced in our investigation. And we asked the ministry to exert pressure on Polar, so the Finnish company would act to fix the issue.
The ministry took immediate action
The Dutch Ministry of Defense instantly took several protective measures. It informed its employees of the risks associated with apps like these. It disabled the use of these apps on the telephones it provides to its employees. And there are plans to altogether prohibit the use of this kind of technology by specific high-risk groups under the ministry’s command, such as intelligence operatives and military personnel on assignment.
What’s more, the day we published our story the ministry sent all its employees a push notice specifically warning them not to use Polar. It also contacted foreign defense departments whose employees were identified by our investigation, as well as other government agencies. The ministry also approached Polar.
In our opinion, the ministry has responded effectively to the results of our investigation. We’ve been in almost daily contact with the ministry to keep each other informed as things progress.
But Polar seemed rather blasé
On June 25, we contacted Polar in Finland and presented our findings. We also gave the company a detailed account of the specific flaws we exploited in its app, and we suggested ways it might fix those flaws. We also asked the company to hire an independent advisor to scan and test Polar’s systems.
Communication got off to a rocky start. Two of Polar's engineers were our initial points of contact. They didn't give us the sense that Polar recognized the severity of the situation. The company initially told us it would make only a few minimal changes to its website and the app.
We then contacted a Finnish journalist we know and trust, Hanna Nikkanen at Long Play. She ramped up the pressure on Polar and made sure its senior management was informed of the issue.
None of that produced the desired reaction. According to Marco Suvilaakso, Polar's chief strategy officer, Polar is in full compliance with the new European privacy law (the GDPR), and it's the user's responsibility to decide which data they publicly share.
Meanwhile, we asked Nikkanen and the Dutch Ministry of Defense to contact the Finnish Ministry of Defense. An American journalist we often work with, Zack Whittaker at CBS News, simultaneously exerted pressure on Polar’s representatives in the US.
The screen finally goes dark
At last, on Wednesday, July 4, the company informed us it had decided to disable the map on its website after all. That means it’s no longer possible to view workouts and trace identities, eliminating any risk to users from the publication of our piece.
Last Tuesday we also began alerting other technology companies: Endomondo, Runkeeper, and Strava.
The issue at these companies is slightly different from the one at Polar. The specifics of the Polar flaw can't be used to ferret out sensitive information in these other apps.
Translated from Dutch by Grayson Morris and Rufus Kain. The original articles in Dutch can be found here.
All our coverage in English can be found below.