Facebook ends test of suggested responses on livestreams

By Brandy Zadrozny with NBC News Tech and Science News



Facebook tested a new feature on Monday that offered users suggested responses to live videos, a test that quickly drew criticism because some of those livestreams covered sensitive and tragic events.

The new feature appeared on NBC News' livestreamed coverage of a shooting at a Chicago hospital that left four dead. The comments section on the Facebook livestream offered several clickable suggested responses, including "This is so sad," "Heartbreaking" and an emoji of hands clasped in prayer, among others. The feature was first noticed by Steph Haberman, a freelance social media editor for MSNBC and NBC Politics.

Facebook said on Tuesday that the feature was a test, which it has since stopped.

"We have been testing a suggested comment feature on Live videos," a Facebook spokesperson said in an email. "Clearly this wasn't implemented properly and we have disabled this feature for now."

Suggested comments are no longer available on Facebook's live videos, but before the feature was switched off, it appeared on other livestreams, including pages belonging to local news outlets, shopping channels and gamers, BuzzFeed News reported.

On a local news video about a sexual assault and a shooting in St. Louis, Facebook offered suggestions such as "respect," "take care" and a wide-eyed, raised-eyebrows emoji, BuzzFeed reported. On another NBC News story about the Chicago shooting, Facebook suggested a kissing face emoji, according to BuzzFeed.

The suggested comments were not the first time Facebook has been criticized for features rolled out during tragedies. In 2017, the platform stopped promoting solidarity filters, translucent flags that would cover a user's profile photo to show compassion for the country where an attack had occurred, after criticism that the feature favored Western countries. The backlash stemmed from Facebook offering a solidarity filter after the 2015 Paris attacks that left 130 dead, but not after an attack in Beirut the same week that killed 43.

Facebook's Safety Check — the feature that allowed users in an area hit by a natural disaster or terror attack to signal their safety to friends — faced similar criticism for activating during emergencies in Paris, Brussels and Orlando, but not in Iraq. In 2016, Facebook made the Safety Check feature automated.

Facebook has also had to apologize to users for algorithms that invited some of them to relive their most painful memories. Just before Christmas in 2014, the company surfaced photos of burning houses and deceased relatives for some users in a feature known as Year in Review.

For many on Twitter, Facebook's suggested comments feature experiment felt like another failure in automated responses to national tragedies.

"This wins for most dystopian thing I've seen all day (and I live in the smoke-drenched Bay Area where everyone is wearing masks, so that's saying a lot)," tweeted one user.

"So glad we've automated the process of having thoughts and feelings about that, because doing it manually was f****** exhausting," tweeted another.

But the Facebook experiment may also be a natural next step for automated responses that have become commonplace on social media and personal tech.

Google first rolled out its suggested responses for Gmail in May 2017. Called "Smart Reply," the feature was touted by the company as a time saver, offering users three responses based on the content of a received email. The system learns from a user's responses, slowly tailoring itself to a person's linguistic cues over time.

After a two-year rollout, Smart Reply became a default Gmail feature last month. Early hiccups (the system used to favor "I love you" as a reply to most emails) were tweaked by Google's developers, and some 10 percent of Gmail replies now contain the suggestions, according to the company.
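To illustrate the general idea in the loosest terms, the toy sketch below (in Python, with made-up canned replies and hypothetical functions) shows how a suggested-reply feature could rank a handful of prewritten responses against an incoming message and nudge the ranking toward whatever the user actually clicks. It is an assumption-laden illustration only and bears no relation to Google's or Facebook's actual systems, which rely on machine-learned language models rather than keyword rules.

```python
# Illustrative toy only: a keyword-based suggested-reply ranker with a simple
# per-user preference weight. This is NOT how Smart Reply or Facebook's
# suggested comments actually work; it is a sketch of the general pattern
# described in the article (suggest a few replies, learn from what gets picked).
from collections import defaultdict

# Hypothetical canned replies and the keywords that make each one plausible.
CANNED_REPLIES = {
    "Sounds good, thanks!": {"plan", "meeting", "schedule", "tomorrow"},
    "Congratulations!": {"promotion", "baby", "engaged", "won"},
    "I'm so sorry to hear that.": {"sad", "loss", "passed", "sorry"},
}

# How often this user has picked each reply; starts flat and grows over time.
user_pick_counts = defaultdict(int)


def suggest_replies(message: str, k: int = 3) -> list:
    """Return up to k canned replies, ranked by keyword overlap with the
    message plus a small bonus for replies this user has chosen before."""
    words = set(message.lower().split())
    scored = []
    for reply, keywords in CANNED_REPLIES.items():
        score = len(words & keywords) + 0.1 * user_pick_counts[reply]
        scored.append((score, reply))
    scored.sort(reverse=True)
    return [reply for score, reply in scored[:k] if score > 0]


def record_pick(reply: str) -> None:
    """Remember which suggestion the user clicked, so it ranks slightly
    higher next time (the 'learns from a user's responses' part)."""
    user_pick_counts[reply] += 1


if __name__ == "__main__":
    msg = "Can we move tomorrow's meeting to 3pm?"
    options = suggest_replies(msg)
    print(options)             # e.g. ["Sounds good, thanks!"]
    if options:
        record_pick(options[0])  # the choice nudges future rankings
```

Real systems replace the keyword sets with learned models of language, but the basic loop of suggesting, observing the click and adjusting is the same.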

And it's not just Google. LinkedIn now offers suggested replies for its InMail; Instagram suggests emojis in the comment field; and Apple has been predicting text replies via its QuickType keyboard since 2014.

Sherry Turkle, founding director of MIT's Initiative on Technology and Self and author of several books including "Alone Together: Why We Expect More from Technology and Less from Each Other," said Facebook's suggested comment feature was the Hallmark card of our times.

"We expect our technology to give us a prepackaged responses to devastation, to carnage, to tragedy. Not just emojis now, but words," Turkle said, citing the comment feature as well as Facebook's suggested responses to users' birthdays and deaths of loved ones.


"Digital technology wants to make the world friction-free," Turkle said. 'But life is full of friction. To just hit the button, or the emoji, starts to make the response friction-free and that's why it's disturbing. Not because of the lack of creativity in the words, but because so little is asked of us. You don't have to think at all about what happened."
