OKRs, YouTube and the Danger of Unintended Consequences

"All other things being equal, our goal is to increase [video] watch time", thus began an email sent in September 2011 by Cristos Goodrow to the YouTube leadership team.

He couldn't have known then what would become of that decision. He had joined the YouTube team in February 2011, having previously worked on Google Product Search.

Per John Doerr's excellent Measure What Matters, the story goes like this.

Cristos found himself in a new team, one inexperienced in the Google way of working with OKRs.

The company—around eight hundred people at the time—was producing hundreds of OKRs each quarter. A team would open a Google doc and start typing in objectives; they'd wind up with thirty or forty for ten people, and fewer than half would actually get done.

As a new member responsible for search and discovery at YouTube, Cristos found himself introducing method and discipline to YouTube's goal-setting approach. With the help of Shishir Mehrotra, YouTube's new technical leader, they took a top-down approach to ensure OKRs were focused on solving the big core issues.

With more than a smattering of good luck, it turned out that an engineer named Jim McFadden, in Google's research group, was working on a video recommendation engine with significant potential to increase the number of videos viewed.

At the time, YouTube's core metric was views: the number of times videos were clicked on and watched. Cristos changed that with an intuitive leap: longer watch times meant a more satisfied audience, leading to more advertising and more content creators joining the platform, what John Doerr calls "a virtuous cycle". From here on, YouTube's core metric would be watch time.

It took six months, but finally, in March 2012, Cristos won his argument and, building on Jim's prior work, launched a watch-time-optimised recommendation engine.
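To make that shift concrete, here is a minimal sketch of what changing the ranking objective from views to watch time can mean in practice. Everything in it is hypothetical: the candidate videos, probabilities and minutes are invented for illustration, and this is not a description of YouTube's actual system.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A candidate video with model-estimated engagement signals (illustrative only)."""
    title: str
    p_click: float           # estimated probability the user clicks (the old "views" proxy)
    expected_minutes: float   # estimated minutes watched if clicked (the new "watch time" signal)

def rank_by_views(candidates):
    # Old objective: maximise the chance of another view.
    return sorted(candidates, key=lambda c: c.p_click, reverse=True)

def rank_by_watch_time(candidates):
    # New objective: maximise expected watch time, i.e. P(click) * E[minutes | click].
    return sorted(candidates, key=lambda c: c.p_click * c.expected_minutes, reverse=True)

videos = [
    Candidate("Catchy 30-second clip", p_click=0.30, expected_minutes=0.5),
    Candidate("Hour-long commentary",  p_click=0.10, expected_minutes=40.0),
    Candidate("Ten-minute tutorial",   p_click=0.20, expected_minutes=8.0),
]

print([c.title for c in rank_by_views(videos)])       # the catchy clip wins
print([c.title for c in rank_by_watch_time(videos)])  # the hour-long commentary wins
```

The point of the sketch is that the new objective is indifferent to what keeps people watching: an hour-long lecture and an hour-long conspiracy rant look exactly the same to it.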

Later that year, presumably after some early success with Jim's new engine and Cristos' goal, Shishir gathered the core team at the annual YouTube Leadership Summit to declare a new goal: one billion hours of daily user watch time, a tenfold increase on their numbers at the time. To call it a mere BHAG (Big Hairy Audacious Goal) would be an understatement.


Objective:

Reach 1 billion hours of watch time per day [by 2016], with growth driven by:

Key Results:

  • Search team + Main App (+XX%), Living Room (+XX%)
  • Grow engagement and gaming watch time (X watch hours per day)
  • Launch YouTube VR experience and grow VR catalog from X to Y

This was to be a four-year goal, running to the end of 2016, with rolling quarterly objectives and key results. Shishir understood the importance of breaking the goal down into more easily digestible chunks.

I don't think we'll ever know precisely what tweaks were made to the YouTube recommendation algorithm in those four years, but under the supervision of Susan Wojcicki, who joined YouTube as CEO in February 2014, the team managed to hit their extraordinary goal in early October 2016.

Then Things Started Going Wrong.

Due to the extraordinary reach available to those willing to game the algorithmic systems of social media platforms like Facebook, Twitter and YouTube, unscrupulous groups like ISIS started using the platforms to spread their message of fear and terror across the world.

The response from YouTube was relatively swift, with a "call to arms" issued by Google in an effort to take down these rapidly spreading videos.

Google executives used their stage at the Cannes Lions advertising festival to say the following:

"We used to think of terrorists as people who are hiding out in caves But now would-be terrorists are hanging out online. Technology is one of the greatest tools we have to reach at-risk youth all over the world and divert them from hate and radicalisation. We can only do that if we offer them alternatives. Only on open and diverse sites like YouTube… that we can find these countervailing points of view".

They must have done something right: to all appearances, the volume of terrorist videos and content dropped as it was deleted by YouTube's content moderation teams.

Sadly, it didn't take long after the successful completion of their extraordinary BHAG for stories of YouTube being used as a tool of radicalisation to emerge again. This time, they marked the rise of the alt-right across the Western world.

Shortly after their extraordinary achievement, in late November 2016, a story emerged in The Guardian's opinion section from an anonymous author, describing in all-too-painful detail their fall into the world of the alt-right, in no small part as a direct result of YouTube's video recommendation algorithm.

This, I think, is where YouTube's "suggested videos" can lead you down a rabbit hole. Moving on from Harris, I unlocked the Pandora's box of "It's not racist to criticise Islam!" content. Eventually I was introduced, by YouTube algorithms, to Milo Yiannopoulos and various "anti-SJW" videos (SJW, or social justice warrior, is a pejorative directed at progressives). They were shocking at first, but always presented as innocuous criticism from people claiming to be liberals themselves, or centrists, sometimes "just a regular conservative" – but never, ever identifying as the dreaded "alt-right".

This is Nothing New.

Radicalisation on the internet is nothing new; the potential for social networks to be misused for radicalisation has been a topic of serious study since the 9/11 terror attacks. Studies specifically into YouTube's contribution to the phenomenon were conducted well before Cristos and his colleagues had even joined the company.

In 2008, Conway and McInerney published a paper titled Jihadi Video and Auto-radicalisation: Evidence from an Exploratory YouTube Study.

The establishment of YouTube and similar video-sharing sites, on the other hand, brought about a democratisation of access to jihadi video content as a result of the significant decrease in costs they introduced. Not only did YouTube become an immediate repository for large amounts of jihadist video content, but the social networking aspects of the site also facilitate interaction between those who post video and those who comment upon it thus opening new possibilities for a.) radicalisation via the Internet, but also b.) empirical analysis of same.

While this was only an exploratory study, it did come to some notable conclusions, showing that jihadist content was spreading far beyond what would traditionally be considered jihadist websites. These groups were quickly embracing video sharing and social networking to extend the reach of their content beyond their core support base.

In 2009, Bermingham et al. published a study titled Combining Social Network Analysis and Sentiment Analysis to Explore the Potential for Online Radicalisation, in which they explored the following:

The increased online presence of jihadists has raised the possibility of individuals being radicalised via the Internet. To date, the study of violent radicalisation has focused on dedicated jihadist websites and forums. This may not be the ideal starting point for such research, as participants in these venues may be described as "already made-up minds". Crawling a global social networking platform, such as YouTube, on the other hand, has the potential to unearth content and interaction aimed at radicalisation of those with little or no apparent prior interest in violent jihadism.

The study was quick to mention that, while prior research had focused on jihadi content, it would also be useful to study neo-Nazis and other groups with a history of violence.

This raises the question: why weren't YouTube's product managers sensitive to these issues, and why didn't they do everything in their power to reduce the spread of hate using the tools at their disposal?

How YouTube Failed Us

Fifteen-year-old David Sherratt was radicalised by the far right through online video game communities and content. His story, published by The Daily Beast, echoes that of so many young people who fell down the rabbit hole thanks in no small part to YouTube's video recommendation algorithms.

That fixation on watch-time can be banal or dangerous, said Becca Lewis, a researcher with the technology research nonprofit Data & Society. "In terms of YouTube's business model and attempts to keep users engaged on their content, it makes sense what we're seeing the algorithms do," Lewis said. "That algorithmic behavior is great if you're looking for makeup artists and you watch one person's content and want a bunch of other people's advice on how to do your eye shadow. But it becomes a lot more problematic when you're talking about political and extremist content."

In Brazil, 16-year-old Matheus Dominguez fell down the far-right rabbit hole after being recommended a video from a far-right guitar teacher while watching guitar-related content.

One day, it directed him to an amateur guitar teacher named Nando Moura, who had gained a wide following by posting videos about heavy metal, video games and, most of all, politics.

In colorful and paranoid far-right rants, Mr. Moura accused feminists, teachers and mainstream politicians of waging vast conspiracies. Mr. Dominguez was hooked.

His story wasn't an isolated case: Brazil's now-President had himself been a highly successful star of the country's far-right YouTube scene. Farshad Shadloo, a spokesman for Google, went on to say "we've seen that authoritative content is thriving in Brazil and is some of the most recommended content on the site." This is a clear indicator that Google and YouTube were all too aware of the consequences of their video recommendation algorithm.

Zeynep Tufekci also discovered this firsthand when she spent some time watching videos of Donald Trump on YouTube during the 2016 presidential campaign.

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and "autoplay" videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

A report by Bloomberg found that not only was Google aware of alt-right content, it was considered as important to the business as music, sports or gaming.

Some employees still sought out these videos anyway. One telling moment happened around early 2018, according to two people familiar with it. An employee decided to create a new YouTube "vertical," a category that the company uses to group its mountain of video footage. This person gathered together videos under an imagined vertical for the "alt-right," the political ensemble loosely tied to Trump. Based on engagement, the hypothetical alt-right category sat with music, sports and gaming as the most popular channels at YouTube, an attempt to show how critical these videos were to YouTube's business.

What emerges from these varied reports is a clear picture of the harm done by YouTube in pursuit of their BHAG: the young people who were shown content that led them into the embrace of the alt-right or far right, and the subsequent spread of racism and misogyny across the Western world.

Failure as a Service

"When a measure becomes a target, it ceases to be a good measure"

Goodhart's law illustrates how complex systems can exploit crude measures, a failing made all too clear by YouTube's single-minded obsession with hitting its watch-time goal and by the fallout we are now living with as a result.
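Goodhart's failure mode can be shown with a toy sketch. The numbers below are made up and have nothing to do with YouTube's data; the only point is that a proxy metric which correlates with what you actually care about stops being informative once you select hard on it, because you end up selecting for whatever inflates the proxy rather than for the underlying value.

```python
import random

random.seed(0)

# Toy model: each "video" has a true value (how well it actually serves the
# viewer, which we cannot measure directly) and a measured proxy (say,
# predicted watch time). The proxy equals the true value plus an "inflation"
# term: everything that pumps up the metric without serving the viewer.
videos = []
for _ in range(1000):
    true_value = random.random()           # what we actually care about
    inflation = random.expovariate(1.0)    # sensationalism, rabbit holes, autoplay loops...
    videos.append((true_value + inflation, true_value))

# Pick the "top 10" two ways: by the thing we care about, and by the proxy.
best_by_value = sorted(videos, key=lambda v: v[1], reverse=True)[:10]
best_by_proxy = sorted(videos, key=lambda v: v[0], reverse=True)[:10]

def avg_true_value(selection):
    return sum(true for _, true in selection) / len(selection)

print(f"selecting on true value: {avg_true_value(best_by_value):.2f}")  # close to 1.0
print(f"selecting on the proxy:  {avg_true_value(best_by_proxy):.2f}")  # much lower
```

Select hard on the proxy and you mostly select for the inflation term: the measure stops being a good measure the moment it becomes the target.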

All that remains of one poem by the Greek poet Archilochus of Paros, written some two thousand six hundred and sixty years ago (give or take a few decades), is the fragment "The fox knows many things; the hedgehog one big thing". The original context is long forgotten, but Isaiah Berlin, in his hugely influential essay "The Hedgehog and the Fox", used it as a way of classifying writers and thinkers: hedgehogs, who view the world through a single defining idea, and foxes, who draw on a wide variety of experience and cannot narrow the world down to a single idea.

John Lewis Gaddis, the Robert A. Lovett Professor of Military and Naval History at Yale University, in his book On Grand Strategy takes this metaphor even further.

A hedgehog may use a compass to know true North, but in the process of getting there may plunge headlong into swamps and chasms that bar his path.

And so it is for YouTube and their watch-time goal: a veritable swamp that has swallowed YouTube whole, and, in the process, Western democracy as we know it.