
As algorithms take over, YouTube’s recommendations highlight a human problem


Once on the outside, Chaslot said, he created a program to analyze how the algorithm recommended conspiracy videos: it used a YouTube account with no viewing history to search for certain topics and recorded which videos were recommended to users most often.

This means that while “good” or “harmless” videos might be included in the mix of recommendations, YouTube repeatedly invited users to click on certain videos much more than others, essentially giving them free advertising. Chaslot initially shared his research with The Guardian.
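
The mechanics of such a tally are simple to sketch. What follows is a minimal, hypothetical reconstruction of the approach described above, not Chaslot’s published code: seed a logged-out session with a search term, follow the recommended videos for a few hops, and count how often each one is suggested. The fetch_search_results and fetch_recommendations functions are assumed stand-ins for whatever layer actually collects YouTube’s search results and sidebar recommendations.

    # Hypothetical sketch of a recommendation tally, not Chaslot's actual code.
    # fetch_search_results(query) -> list of video IDs returned for a search term
    # fetch_recommendations(video_id) -> list of video IDs YouTube recommends next
    from collections import Counter

    def tally_recommendations(seed_query, fetch_search_results, fetch_recommendations,
                              hops=2, per_video=20):
        counts = Counter()                           # video ID -> times recommended
        frontier = fetch_search_results(seed_query)  # start from a fresh search
        for _ in range(hops):                        # follow recommendations a few levels deep
            next_frontier = []
            for video_id in frontier:
                recs = fetch_recommendations(video_id)[:per_video]
                counts.update(recs)                  # every appearance counts as one "invitation"
                next_frontier.extend(recs)
            frontier = next_frontier
        return counts.most_common(10)                # the videos pushed hardest for this query

Counting every appearance, rather than unique videos, is what captures the “free advertising” effect described above: a video recommended from many different starting points rises to the top of the tally.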

His analysis found that when searching for “Is the Earth flat or round?” the top recommendation YouTube kept showing users in early February was “THE BEST Flat Earth VIDEO | 100% Proof The Earth Is Flat | Please Debunk This I Dare You!!!!” followed by “Top 10 Reasons People Believe The Earth Is FLAT!” and “BEST FLAT EARTH PROOF 2017 – YOU CANT DENY THIS EVIDENCE.”

Searching for tragedies can turn up even more disturbing results. A search for “Sandy Hook Shooting” in November 2017 returned, among the top recommended videos, the now-removed “BELIEVE YOUR OWN EYES – 9/11 – ‘NO PLANES’,” followed by videos asserting that the Connecticut school shooting and its victims were a hoax.

[Chart: Chaslot’s model found YouTube was recommending conspiracy videos at much higher rates than others for certain search terms. Source: algotransparency.org]

Researchers at Harvard conducted their own test and found that the algorithm was more often drawing viewers to extreme content and unfounded right-wing conspiracy theories.

Experts say conspiracy videos are perfectly positioned to push our buttons and draw us in to consume more of them, precisely the signals YouTube’s algorithm prioritizes, Robert J. Blaskiewicz Jr. wrote in an email. Blaskiewicz is a columnist for the Committee for Skeptical Inquiry, a nonprofit educational organization that applies scientific analysis to conspiracy theory claims.

“Conspiracy stories hit our emotional fight or flight triggers,” Blaskiewicz wrote. “And the stories rest on the unstated premise that knowledge of the conspiracy will protect you from being manipulated. This in itself compels people to watch and absorb as much as they can and to revisit videos.”

The emotion they provoke is contagious, he said, and they provide ready-made explanations for complex and difficult news events.

By popular demand

YouTube has said it’s simply reflecting what users want to see, and that videos are chosen based on each user’s individual profile and viewing history.

Publicly, executives have said that the recommendations algorithm drives over 70 percent of content watched on YouTube, and that they’re getting better and better at it all the time.

“Our job is to give the user a steady stream, almost a synthetic or personalized channel,” YouTube’s chief product officer, Neal Mohan, said at CES, the annual consumer tech conference, in Las Vegas in January.

“Higher watch time means more ad inventory,” said Austin Moldow, an equity researcher at Canaccord Genuity, a financial services firm in New York. “More ads, more revenue.”
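
For a rough sense of that arithmetic, here is a toy calculation. The ads-per-hour and CPM figures are invented for illustration; they are not Moldow’s numbers or YouTube’s.

    # Toy illustration of the incentive described above, with made-up rates:
    # more minutes watched means more ad slots, which means more revenue.
    def estimated_revenue(watch_minutes, ads_per_hour=4, cpm_dollars=7.50):
        """Rough ad revenue for a given amount of watch time (hypothetical rates)."""
        impressions = (watch_minutes / 60) * ads_per_hour
        return impressions * cpm_dollars / 1000      # CPM = dollars per 1,000 impressions

    print(estimated_revenue(1_000_000))   # ~$500
    print(estimated_revenue(2_000_000))   # ~$1,000: doubling watch time doubles the (illustrative) revenue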

But just because people are willing to watch something doesn’t mean they’re enjoying it. YouTube has to balance protecting its profits against keeping the trust of its users; fail to walk that line and it begins to undermine the value users get from the platform, said Kara Swisher, Recode executive editor and MSNBC contributor.

“I think it’s a problem not just throughout YouTube, but Google, Facebook, all these companies, that they prioritize growth over anything else. They may not be meaning to do it, but if growth is the goal, then user experience is not the goal,” said Swisher. “Real users, the ones you’re trying to attract, go away. And so it’s in all their interests from a business point of view to clean this place up and to have more control over it, and there’s a moral responsibility to create a platform that isn’t being abused by anybody.”

“Good advertisers don’t wanna be next to these kind of videos either,” she added.

Exploiting the YouTube algorithm is a cottage industry. Video creators who follow the rules can earn a share of advertising revenue, so trends favored by the algorithm are quickly incorporated and uploaded by savvy creators, said Becca Lewis, a researcher at Data & Society, a nonprofit based in New York. Ultimately, if the recommendation engine is promoting conspiracy videos, YouTube is incentivizing creators to make more of them.

But more than just promoting and prioritizing misinformation, these digital tabloid channels can also distort democracy. Right before the 2016 U.S. presidential election, Chaslot’s research found that more than 80 percent of the recommended videos favored Donald Trump. Searching “Trump” led to pro-Trump video recommendations, while searching “Clinton” surfaced mainly anti-Clinton video recommendations.

No quick fix

YouTube has taken steps to reduce incentives for some of the worst offenders. Content about tragedies such as mass shootings is not allowed to generate advertising revenue for its creators through YouTube.

However, that doesn’t stop creators from making money in other ways: direct-donation links in their video descriptions, online merchandise stores, affiliate links for apps, or paid mentions within the videos themselves.

In a statement to NBC News, a YouTube spokesperson said “our recommendation system has changed substantially over time and no longer works the way it did five years ago.” While it used to optimize for “watch time,” it now has begun to shift focus to “satisfaction,” balancing watch time with additional data points such as likes, dislikes, shares, and surveys. YouTube has also tweaked its algorithm to better show authoritative news sources, especially for breaking news events, the spokesperson said.
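
In the most schematic terms, balancing watch time with those additional data points means folding several signals into a single score. The sketch below is a purely illustrative toy, with invented weights and normalizations; it is not YouTube’s actual ranking model.

    # Purely illustrative toy score, NOT YouTube's real formula: blend watch time
    # with "satisfaction" signals such as likes, dislikes, shares, and surveys.
    def satisfaction_weighted_score(avg_watch_fraction, like_rate, dislike_rate,
                                    share_rate, survey_score):
        """All inputs are assumed 0-1 rates per impression; the weights are invented."""
        engagement = like_rate + share_rate - dislike_rate
        return 0.6 * avg_watch_fraction + 0.2 * engagement + 0.2 * survey_score

    # A heavily watched but heavily disliked video can now score below a shorter,
    # better-liked one:
    print(satisfaction_weighted_score(0.9, 0.01, 0.30, 0.00, 0.2))  # ~0.52
    print(satisfaction_weighted_score(0.6, 0.20, 0.01, 0.05, 0.9))  # ~0.59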

According to YouTube, none of the recommendation system that Chaslot worked on while he was at Google is being used today.

Critics fear that a few updates won’t resolve the core issue. Among them is Tristan Harris, who previously worked as a design ethicist at Google, is now a critic of his former employer, and leads the Center for Humane Technology, a new San Francisco group of former technologists.

Using algorithms creates solutions at exponential scale, one for every customer or citizen, but it also creates problems at that same scale.

“You can’t possibly have exponential consequences with exponential responsibility unless you have an exponential amount of human thought to be dedicated to those challenges,” said Harris. “And you can’t put that just into an algorithm.”

Article source: https://www.nbcnews.com/tech/social-media/algorithms-take-over-youtube-s-recommendations-highlight-human-problem-n867596?cid=public-rss_20180420