Tech companies lost more credibility in the eyes of the public last year than at any other point in the digital era, thanks to a string of high-profile incidents involving user privacy and the mishandling of data. The lesson should be clear: user trust is fragile, even for companies like Facebook that have staked out a fundamental role in public life. The site’s traffic dropped by almost 50 percent between 2016 and mid-2018.

Apple’s latest privacy bug, disclosed this week, offers a mixed verdict on whether these companies are making substantive changes to how they handle user privacy.

On Monday, 9to5Mac disclosed a disturbing security bug in Apple’s FaceTime app – a bug that may have been in place for as long as three months. FaceTime callers were able to eavesdrop on those they had called, even if the intended call recipients had not accepted the calls. In some instances, callers may have been able to access video as well as audio.

At first, Apple acknowledged the bug and said a fix was on its way later this week – leaving an exploitation window wide open in the meantime.

Shortly after that statement, Apple went further and said it was disabling the Group FaceTime feature altogether on its own servers. This was a necessary step, given that the bug allowed users to easily spy on one another without their knowledge. In doing so, Apple clearly acknowledged that user privacy is a key PR issue and crucial to maintaining the public’s trust in its devices.

But unfortunately, that’s not the end of the story.

A user’s tweets from as early as January 20th tagged Apple Support, and then Apple CEO Tim Cook, to warn that the user’s teenage son had discovered the exact same bug Apple is only now addressing, after journalists called attention to it this week. CNET has since identified the user as Michelle Thompson.

The first tweet, on the 20th, read:

“My teen found a major security flaw in Apple’s new iOS. He can listen in to your iPhone/iPad without your approval. I have video. Submitted bug report to @AppleSupport…waiting to hear back to provide details. Scary stuff!”

And then again the next day:

“@tim_cook This is real…trying to get Apple’s attention to get this addressed. I’m just a mom of a teenager who found a huge problem in your new update. I’ve verified it myself…someone from Apple should respond to us.”

After the tweets went unanswered, Thompson tried email and fax, describing the bug in detail and including a link to a YouTube video demonstrating the problem. “My fear is that this flaw could be used for nefarious purposes,” Thompson said in a message. “At this point, I will not release this information to anyone until I hear back from you.”

This eventually led to a response from Apple Support, which inexplicably asked Thompson to sign up for an Apple Developer account to report the bug, even after Thompson described herself as not very tech-savvy.

For such a dramatic breach of privacy, Apple’s response was slow, insufficient, and puzzling: Apple either knew about the bug a week before it took action, or failed to give the user’s reports proper attention. Either situation shows a lack of commitment to user security.

To make matters worse, Apple is a company that goes out of its way to promote the importance of data privacy, even using it as a selling point. One Apple ad boasted specifically, “What happens on your iPhone, stays on your iPhone.”

Last year, Tim Cook made an impassioned speech at a privacy conference in Brussels, not only restating Apple’s commitments to privacy, but also forcefully describing a “data-industrial complex” in which user data is “weaponized against us with military efficiency.”

“Rogue actors and even governments have taken advantage of user trust to deepen divisions, incite violence, and even undermine our shared sense of what is true and what is false. This crisis is real. It is not imagined, or exaggerated, or crazy,” he said.

For a company that represents itself this way, there is little room for errors that give bad actors opportunities to invade privacy. It is concerning not only that Apple likely knew about the issue for a week or so without taking any action, but that an easily exploited privacy bug of this magnitude got past product testers.
