Netflix is your favorite streaming service. If you've somehow missed them so far, you just haven't matched their audience-testing criteria yet. But don't worry, you will. It's just a matter of time.
Netflix has a self-proclaimed "A/B testing culture: nearly every decision [they] make about [their] product and business is guided by member behavior observed in test."
Just a few examples of what they test: UI changes, messaging, marketing, operations, infrastructure, title artwork, personalization algorithms, video encoding... the list goes on. In fact, they've built an entire experimentation architecture so their data scientists can iterate faster. If you want to dig into their many experiments, their dedicated research blog covers them in depth.
For now, we'll focus on how they test the artwork they display for each film presented to members. Their main experimental question: how can we improve the click-through rate (CTR) of the first glance when a member opens Netflix? To set up the experiment, their creative team built multiple artwork variants for each film in question, and each member was shown one of them. They then measured CTR along with aggregate play duration, the fraction of plays with short duration, the fraction of content viewed... in short, measures of engagement.
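To make that setup concrete, here is a minimal sketch of how per-variant metrics like these might be aggregated. The column names, the 10-minute "short play" cutoff, and the sample rows are assumptions for illustration, not Netflix's actual schema or numbers.

```python
import pandas as pd

# Hypothetical impression-level log: one row per member who was shown a variant.
# Column names and values are illustrative assumptions only.
impressions = pd.DataFrame({
    "variant":      ["A", "A", "B", "B", "B", "C"],
    "clicked":      [1,   0,   1,   1,   0,   0],
    "play_seconds": [3600, 0, 5400, 300,  0,   0],
})

summary = impressions.groupby("variant").agg(
    impressions=("clicked", "size"),
    ctr=("clicked", "mean"),                    # click-through rate per variant
    avg_play_seconds=("play_seconds", "mean"),  # aggregate play duration proxy
    # fraction of plays shorter than 10 minutes (assumed "short duration" cutoff)
    short_play_rate=("play_seconds", lambda s: ((s > 0) & (s < 600)).mean()),
)

print(summary)
```

Comparing those per-variant numbers (ideally with a significance test on top) is what lets a team declare one piece of artwork the winner rather than just the luckiest.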
Then they went multivariate. Rather than simply swapping out the full artwork in an A/B/n test, they began labeling components in their artwork (called Stable Identifiers) and systematically varying each one's appearance. They then ran all the resulting variants to determine which components were most effective, in addition to which images were the overall best performers.
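A simple way to picture the multivariate side is generating every combination of the labeled components. The component names and options below are invented for illustration; Netflix describes its Stable Identifiers only at a high level, so treat this as a sketch of a full-factorial design, not their pipeline.

```python
from itertools import product

# Hypothetical labeled artwork components ("stable identifiers") for one title.
# Component names and options are assumptions for illustration.
components = {
    "background": ["dark", "light"],
    "character":  ["hero_closeup", "ensemble"],
    "title_text": ["top", "bottom"],
}

# Full-factorial design: every combination of component options becomes a variant.
variants = [dict(zip(components, combo)) for combo in product(*components.values())]

for i, variant in enumerate(variants, start=1):
    print(f"variant {i}: {variant}")

# 2 x 2 x 2 options -> 8 artwork variants. Scoring them lets you rank each
# component option (e.g., dark vs. light background) as well as each full image.
```

The payoff of labeling components is exactly that double read: you learn which finished image wins and, at the same time, which individual ingredients keep showing up in the winners.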
Both tests were resounding successes. With the first film they tested, one image variant earned a 14% higher CTR. With their multivariate tests, they discovered a winning title format, as well as a subtler truth: "images that have expressive facial emotion that conveys the tone of the title do particularly well." Netflix continues to test artwork, as well as billboards, trailers, montages, and, of course, ads with the same multivariate system, aiming to find the best assets for each title on any canvas.
Next time you flip on Netflix, ask a friend to do the same and see if you can spot differing artwork. You can learn how to run multivariate tests yourself using Marpipe, for free, with our experimentation guides.
Website: https://www.netflix.com/