Stop! Is your methodology biased? Ad measurement provides 'Accountability' (proving that ads work); however, we argue that measurement should produce 'Incrementality' (helping businesses grow with ads). To measure true ad effectiveness, incrementality, we have to move away from the long-accepted methodologies of pre- vs. post-campaign and non-exposed vs. exposed comparisons. In this presentation we will redefine ad measurement and demonstrate, with examples, how we measure it at Google.
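To make the contrast concrete, here is a minimal sketch assuming a randomised holdout design, in which a control group is withheld from the ads before serving rather than compared after the fact (the abstract does not specify Google's actual methodology); all column names and figures are illustrative.

```python
import pandas as pd

# Hypothetical campaign data with a randomised holdout: 'group' is assigned before
# serving ("treatment" eligible to see the ad, "control" withheld), which avoids
# the selection bias of comparing exposed vs. non-exposed users after the campaign.
df = pd.DataFrame({
    "group":     ["treatment"] * 5 + ["control"] * 5,
    "converted": [1, 0, 1, 1, 0,   0, 0, 1, 0, 0],
})

rates = df.groupby("group")["converted"].mean()
incremental_lift = rates["treatment"] - rates["control"]   # absolute incrementality
relative_lift = incremental_lift / rates["control"]        # lift relative to baseline

print(f"Treatment conversion rate: {rates['treatment']:.2f}")
print(f"Control conversion rate:   {rates['control']:.2f}")
print(f"Incremental lift:          {incremental_lift:.2f} ({relative_lift:.0%})")
```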
In today's world, information reaches consumers from many different sources simultaneously and in very different formats. In this context advertising can no longer be treated as one homogeneous platform; each channel must be considered a separate unit within the media plan. Marketers need to think and act faster than media consumption evolves, which makes it increasingly necessary to recognise the role and importance of cross-platform campaigns in maximising the efficiency of marketing investments overall. Because, let's be honest, modern times require us to do much more with much less. In our presentation, we will share ideas about how to potentially increase a campaign's reach and efficiency using a cross-platform marketing strategy.
In the age of performance and programmatic marketing, advertisers have come to expect speed, agility and accuracy in everything from target-audience understanding, to communications planning, to measuring campaign effectiveness. This paper details ways to integrate declaration-based insights from survey data with near real-time behavioural insights from other sources. These two data sets can work together to create one powerful dataset that retains the best of both outputs, delivering new and exciting opportunities to engage today's media-savvy, and often media-fatigued, consumers. Three data integration projects are shared as case studies.
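As a purely illustrative sketch of the kind of integration described above, the snippet below joins declared survey responses to behavioural logs on a shared panellist ID; the data, column names, and join key are assumptions, not the actual projects' data.

```python
import pandas as pd

# Hypothetical declared (survey) data, keyed by panellist ID.
survey = pd.DataFrame({
    "panelist_id": [101, 102, 103],
    "declared_interest": ["sports", "travel", "cooking"],
})

# Hypothetical behavioural log for the same panellists.
behaviour = pd.DataFrame({
    "panelist_id": [101, 101, 102, 103, 103, 103],
    "site_category": ["sports", "news", "travel", "cooking", "cooking", "news"],
    "minutes": [12, 3, 25, 8, 15, 4],
})

# Merge the two sources into a single dataset that keeps both declared and observed signals.
combined = behaviour.merge(survey, on="panelist_id", how="left")

# Example output: observed time spent per declared interest and site category.
print(combined.groupby(["declared_interest", "site_category"])["minutes"].sum())
```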
As more advertising spend migrates online, there's a need to understand the differences between online ads that drive short-term sales and those that drive long-term, profitable brand growth. This need is particularly acute as the industry faces pressure to prove ads are actually being seen, and as consumers increasingly block intrusive ads. Moving the profit needle digitally has never been so important, but it's never been so hard! The key factor we explored was the role of emotion in digital advertising. The role of emotion in making TV and online video advertising effective is well known, but emotion tends to be underplayed as a driver of profitability in digital advertising.
Consumers are not necessarily interested in most brands or ads. Usually they do not have time to reflect on what an ad means for them and the brand, which makes it necessary to understand how consumers process ads. An optimal understanding is key to knowing how your ad performs as well as how to optimise it. Heineken has conducted three different studies to evaluate a commercial from one of Heineken's international brands: a traditional quantitative pre-test and two neuroscience approaches (EEG & eye-tracking, and EEG in a socialising context). This will deliver an optimal future framework for pre-testing commercials in terms of performance, ad optimisation, and the relevance of measuring in a social context.
Digital advertising is one of the largest and most open playgrounds for machine learning, data mining and related analytic approaches. This development is fueled by unparalleled access to consumer activity across all digital devices and by the rise of programmatic advertising, where about 100 billion advertising opportunities are sold daily in real-time auctions. This talk will outline the realities and promises of infusing a historically intuition-based industry with rigorous machine learning for targeting, creative optimisation, and measurement. We will also touch on a number of challenges that arise in this environment: 1) high-volume data streams of around 30 billion daily consumer touch points, 2) low-latency requirements for scoring and automated bidding decisions within 100 ms, and 3) adversarial modeling in the light of advertising fraud and bots.
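As a rough illustration of the latency constraint, the sketch below scores a bid request with a toy logistic model and falls back to a default bid if the 100 ms budget is exceeded; the model, feature vector, and function names are hypothetical stand-ins for a production system, not the talk's actual pipeline.

```python
import time
import numpy as np

LATENCY_BUDGET_MS = 100  # real-time auctions typically require a response within ~100 ms

def score_ctr(features: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Estimate click probability with a simple logistic model (placeholder for a real model)."""
    return 1.0 / (1.0 + np.exp(-(features @ weights + bias)))

def bid(features: np.ndarray, weights: np.ndarray, bias: float,
        value_per_click: float, fallback_bid: float = 0.0) -> float:
    """Return a bid price, falling back to a default if the latency budget is exceeded."""
    start = time.perf_counter()
    p_click = score_ctr(features, weights, bias)
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > LATENCY_BUDGET_MS:
        return fallback_bid           # too slow: do not risk missing the auction deadline
    return p_click * value_per_click  # expected-value bid

# Hypothetical usage with random model parameters.
rng = np.random.default_rng(0)
features, weights = rng.normal(size=20), rng.normal(size=20)
print(round(bid(features, weights, bias=-2.0, value_per_click=1.50), 4))
```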
Ad fraud is an ever-increasing curse of digital advertising. It is driven by bots that load webpages and ad impressions and generate fake clicks. This fraudulent activity "messes up" measurement, which means that business decisions made on the basis of the analytics could be entirely wrong. Find out about the two main forms of digital ad fraud, how to detect them, and how to correct for them in the data.
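The abstract does not reveal its detection methods, but a minimal sketch of the general idea, using two simple heuristics on a hypothetical impression log (known headless-browser user agents and implausibly high per-user click rates) and then recomputing metrics on the cleaned data, might look like this; all names and thresholds are illustrative.

```python
import pandas as pd

# Hypothetical impression/click log; column names and values are illustrative assumptions.
log = pd.DataFrame({
    "user_id":    ["u1", "u1", "u2", "bot9", "bot9", "bot9", "u3"],
    "user_agent": ["Chrome", "Chrome", "Safari",
                   "HeadlessChrome", "HeadlessChrome", "HeadlessChrome", "Firefox"],
    "clicked":    [0, 1, 0, 1, 1, 1, 0],
})

# Heuristic 1: user agents that identify themselves as automated browsers.
bot_agents = log["user_agent"].str.contains("Headless", case=False)

# Heuristic 2: users whose click-through rate is implausibly high across many impressions.
per_user = log.groupby("user_id")["clicked"].agg(["count", "mean"])
suspicious_users = per_user[(per_user["count"] >= 3) & (per_user["mean"] > 0.9)].index
bot_users = log["user_id"].isin(suspicious_users)

# Correct the data by excluding suspected bot traffic before reporting metrics.
clean = log[~(bot_agents | bot_users)]

print("CTR before correction:", round(log["clicked"].mean(), 3))
print("CTR after correction: ", round(clean["clicked"].mean(), 3))
```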