How consumer data is shaping creative

WARC recently published this article by our Head of Editorial, Charlie Cottrell, examining the increasingly complex world of consumer data and how creative can strike the right balance, so that the resulting ads feel like a service and not an imposition. They’ve been kind enough to let us reproduce it below.

The volume of consumer data now being harvested by platforms and devices is astronomical, and the methods of capture are becoming increasingly sophisticated. This exponential growth is opening up new and exciting opportunities for marketing creative, but at the same time it creates moral dilemmas about how this information should be used.

So, what kind of data is being captured? In the app world, the prize went to Amazon’s Alexa, which topped last year’s charts for both Android and iPhone downloads. The quirk and convenience of being able to speak, Jean-Luc Picard-like, to a robot valet trumped fearful headlines in the tech and tabloid press, suggesting voice-operated devices might be discreetly eavesdropping on your conversations and sending valuable insights to third-party data collectors and maybe even the CIA.

We’re becoming used to giving up access to our data in exchange for something. When we shop online, we log in with Facebook to avoid the banality of filling in account details. We allow social media platforms access to our profiles and we accept ‘cookies’ to view content on websites. Beyond the obvious form-filling, there are more subtle data points we’re sharing: we leave clues about our preferences all over the internet each time we like or comment on something; and using our new favourite voice-controlled helper, Alexa, offers up new demographic variables such as intonation, word choice, pitch and pace.

Sometimes, you need to be creative to get to the data. To understand what makes a voice-activated service popular in the wild, Amazon has a reward scheme for developers whose Alexa Skills keep people engaging with the Echo for longer. This is a win-win: creatives and developers are incentivised to make category-defining work across genres – such as food and drink, health and fitness and lifestyle – and the operating system learns from real human data how to deliver a better user experience. In other words, when you’re a few cocktails down and barking at Alexa as a party piece, you’re really contributing to the advancement of voice technology.

All this new data is allowing us to paint more textured character profiles of our consumers. Think of the classic pen portrait that invites us to ‘meet Janet, a 35-year-old busy, working mother of two…’ complete with a clichéd summary of her preferences, and a stock image of a maniacally grinning model. This is laughably vague; now, data gives us the ability to segment an audience by mindset, fears, actual versus reported behaviour, shopping history, location, current mood, heart rate and favourite emojis.

However, data alone is not a silver bullet; we have to know how to attack the information. This is where AI comes in. As robot brains become more sophisticated, they’re able to replicate the human brain’s method of absorbing data, looking for patterns, following hunches, testing models and analysing the results.

They do what we do, but infinitely faster and without the burden of human limiting thoughts – IBM Watson isn’t carrying around a niggling childhood doubt that it’s not good at art. The application of computerised assistance allows us to cast a net into this vast sea of data points, to help us advance creatively in three main areas: identifying an opportunity, shaping the creative execution, and sharply targeted distribution.

As marketers, we know we’re competing with more people, on more platforms, for our consumers’ attention, so there’s even more pressure to uncover an untapped insight, which can springboard the creative and deliver results. Whether you approach this quest via blue-sky thinking or using meticulous strategic frameworks, it takes time – and that’s at odds with the pace of modern, always-on marketing. It’s not just a great insight that’s important, it’s the speed at which you can arrive at that insight.

On the IBM Watson website, the ability to cross-reference Big Data at speed is throwing up interesting ‘what if?’ questions that could give tangible advantage to marketing campaigns: ‘What if the weather could trigger personalised consumer interactions?’ seems especially pertinent to the UK. The proposal knits together weather forecasts, data on previous consumer reactions to weather, and postcode information. This would give a savvy brand the heads-up to get a timely piece of creative live, before their competitors wake up and realise it’s unseasonably sunny.
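A weather trigger of this kind reduces to a simple rule: fire the creative when the forecast beats the seasonal norm for a postcode area by a clear margin. The sketch below is purely illustrative – the norms, thresholds and field names are assumptions, not Watson’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Forecast:
    postcode_area: str   # e.g. "B" for Birmingham, "SW" for south-west London
    temp_c: float
    condition: str       # e.g. "sunny", "rain"

# Illustrative seasonal norms (assumed January averages, not real data)
SEASONAL_NORM_C = {"B": 9.0, "SW": 10.5}

def should_trigger_creative(f: Forecast, margin_c: float = 5.0) -> bool:
    """Fire the 'unseasonably sunny' creative when the forecast is both
    sunny and clearly warmer than the seasonal norm for that area."""
    norm = SEASONAL_NORM_C.get(f.postcode_area)
    if norm is None or f.condition != "sunny":
        return False
    return f.temp_c - norm >= margin_c

print(should_trigger_creative(Forecast("SW", 16.0, "sunny")))  # True
print(should_trigger_creative(Forecast("B", 10.0, "rain")))    # False
```

In practice the ‘previous consumer reactions’ data would set the margin per region, rather than a hand-picked constant.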

Shaping creativity
In 2016, Dutch bank ING used data and AI to create a portrait in the style of an Old Master: ‘The Next Rembrandt’ used thousands of diverse data points to produce a suitably faithful image and sparked the narrative that robots were coming for our creative jobs. Creativity, thankfully, is more than the ability to reproduce an image at speed. In terms of inspired artistry, The Next Rembrandt is really just a sophisticated inkjet printer. In a similar vein, being concerned that AI tools are getting better at using data to create images or generate copy misses the point. It’s like being upset that Excel can make a pivot table faster than you can. The logic that stops us smashing the looms and shrieking ‘Witch!’ at Microsoft Office is applicable here too; we should regard data-powered computational creativity as a tirelessly diligent and proactive assistant.


The Next Rembrandt: used thousands of diverse data points to produce a portrait in the style of an Old Master

Using recipes to market a food magazine is not creative dynamite, but using Big Data to create those recipes was. Working as a turbo-powered apprentice, IBM Watson helped Bon Appétit magazine to analyse 10,000 recipes and a fantastical-sounding quintillion possible ingredient combinations to develop a new recipe range, including combinations such as ‘ginger, compote and cod pizza’ and ‘shiitake and Gruyère ice pops’, which no human chef would consider pairing, but which proved to be delicious.
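The ‘quintillion combinations’ figure stops sounding fantastical once you do the arithmetic: choosing even a handful of ingredients from a large pantry explodes combinatorially. The pantry and dish sizes below are illustrative assumptions, not Watson’s actual inputs.

```python
import math

pantry = 1000    # distinct ingredients available (assumption)
per_dish = 8     # ingredients chosen per candidate recipe (assumption)

# Binomial coefficient: ways to choose 8 ingredients from 1000
combos = math.comb(pantry, per_dish)
print(f"{combos:.2e}")  # on the order of 10^19 -- tens of quintillions
```

No kitchen can taste-test a space that size, which is why the machine’s job is to prune it down to the few pairings worth putting in front of a chef.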

In its first year, The Washington Post’s data-powered AI reporter, Heliograf, published 850 human-sounding articles covering everything from the Olympic Games to gubernatorial campaign trails and high-school football. It’s one of many news organisations, including the Associated Press and USA Today, outsourcing time-consuming, formulaic reporting to AI, freeing human journalists to focus on investigating stories.

Traditionally a passive medium, TV is using data to bring shows to life in ways that are much more joyful and seamless than second-screening. Netflix is known for using its incredible bank of viewer data to inform how it commissions and casts shows. Last year, this process became even more directly consumer data-driven, through a partnership between HBO and Google. Google Home-owning viewers of Westworld were immersed in the dystopian world of the TV show via ‘Aeden’, a chatbot designed to tease, mirror and allow users to participate in the action of the series. The show’s writers helped to script the bot’s responses to viewer questions, and those questions gave the writers data on how viewers were reacting to the show, which was used to shape the plotlines. The chatbot went on to win an Emmy Award for Outstanding Creative Achievement in Interactive Media within a Scripted Program.

One-to-one at scale
Dynamic Social Video sits at the crossroads of creativity and distribution. DSV is a video ad format made up of fixed and editable segments. Using consumer data, each editable segment can be switched out, instantly, to personalise the ad for the person it is being served to. Let’s say you’re a beauty brand that wants to drive product purchases in the run-up to wedding season. You run an advertisement made of six segments: the first and last carry the brand logo and appear the same to everyone who sees the ad; the four middle segments can be customised according to variables such as age, skin tone, income and location. So a 20-year-old, olive-skinned person with limited disposable income, in Birmingham, might be served a version featuring a ruby red lipstick and a call to action to visit the beauty counter at a local department store.
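The assembly logic behind such an ad is straightforward: fixed bookends plus middle segments selected per viewer. A minimal sketch, in which every segment name and selection rule is an invented assumption rather than any vendor’s real format:

```python
# Fixed bookend segments, identical for every viewer
FIXED_OPEN, FIXED_CLOSE = "brand_intro.mp4", "brand_logo.mp4"

def pick_segments(viewer: dict) -> list[str]:
    """Assemble a six-segment ad: fixed open, four data-driven
    editable segments, fixed close."""
    editable = [
        f"look_{viewer['age_band']}.mp4",        # product look by age band
        f"model_{viewer['skin_tone']}.mp4",      # model matched to skin tone
        ("premium_service.mp4" if viewer["income"] == "high"
         else "counter_offer.mp4"),              # offer scaled to income
        f"store_{viewer['city'].lower()}.mp4",   # localised call to action
    ]
    return [FIXED_OPEN, *editable, FIXED_CLOSE]

ad = pick_segments({"age_band": "18-24", "skin_tone": "olive",
                    "income": "low", "city": "Birmingham"})
print(ad)
```

The 20-year-old in Birmingham and the 45-year-old in London get the same bookends but different middles – one render pipeline, many ads.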

Alternatively, a 45-year-old person with black skin and more cash to splash, in London, will see a version advertising a personal makeover service in the brand’s flagship store. Results indicate these personalised ads are significantly more effective than generalised ones. Both sides of the deal benefit from the use of data-powered creative in this way; brands have a cost-effective, better targeted way to do one-to-one marketing, and consumers are shown ad content that’s actually useful to them.

Consider the butler versus the stalker: you’d much rather have the former than the latter. When it comes to user experience and marketing, this is the difference between the welcome text message from your airline, reminding you that online check-in is open, and the annoying jeans ad that follows you around the internet like a casually trousered ghost. Getting the balance right relies on the correct creative wrapper: one that applies the incredible wealth of consumer data we can collect so that ads feel like a service, not an imposition.

To this end, patents filed by Facebook push data capture into bold new territories: the realm of emotion-based information. Details such as the speed and pressure of typing can be indicators of the user’s mood, allowing Facebook to automatically insert ‘emotional elements’ into the text – a build on the functionality that cascades confetti when you type ‘congratulations’ on the platform. Another patent covers using visual cues to track a user’s emotional reaction to content by capturing images of their face through their smartphone or laptop camera. These facial ‘tells’ can be analysed to serve the most effective content to that user, in real time.
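To make the idea concrete, the kind of signal the patents describe can be caricatured as a coarse classifier over typing cadence. The thresholds, labels and features below are invented for illustration – the patents publish no formula.

```python
def infer_mood(chars_per_sec: float, backspace_ratio: float) -> str:
    """Toy heuristic: fast, confident typing reads as energetic;
    slow typing with heavy correction reads as hesitant."""
    if chars_per_sec > 6 and backspace_ratio < 0.05:
        return "energetic"
    if chars_per_sec < 2 and backspace_ratio > 0.2:
        return "hesitant"
    return "neutral"

print(infer_mood(7.5, 0.02))  # energetic
print(infer_mood(1.0, 0.30))  # hesitant
```

Even this toy version makes the ethical stakes visible: two floating-point numbers, gathered without the user ever stating anything, become a mood label that can steer what they are shown.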

Just because we can do all this, does that mean we should? Is clicking an ‘agree’ button when you set up your Facebook profile sufficient permission for the platform to activate your camera and use your subconscious micro-expressions to tap into your brain’s reward centre?

Emotion-tracking removes the need for the consumer to articulate a preference, let alone type one, making it easier to market indirectly to children, for example.

If your device detects you are happy, should a platform serve you ads for alcohol to celebrate, or charities to share your good fortune? If it detects you’re depressed, would it serve you content from the Samaritans, or retarget you with a treat you might have left in your Amazon cart, or would it ping that information to your health insurance company so they can adjust your premium? Who is it that makes these decisions? Can something as morally complex as someone’s emotional state be auctioned off to the company with the highest media budget?

Values come into play here. By pairing data and creative in ever more sophisticated ways, companies will begin to define the acceptable boundaries of what can and can’t be exploited for commercial gain. The onus will be on brands, agencies and creatives to declare and stand by their values. Even if that means sacrificing the bottom line.