In the past, implicit and explicit responses to stimuli were gauged in an artificial testing environment. The scientific community most often distinguishes only between System 1 and System 2, and consequently perception has to be placed into one of those two buckets; System 1 would, of course, be the more suitable of the two. Our unique claim, however, is that there is a third system, independent of the other two, which we call System 0. System 0 is a market research innovation in which ads are tested in their natural environment, where respondents are unaware that the ads are being tested, and both implicit consumer response and explicit behaviour are recorded.
Bad ads create bad experiences for consumers and negatively impact your brand. It is critical to know which creative elements or patterns can help drive brand performance and direct response. However, creative A/B tests or lab tests with facial expression analysis are expensive and not scalable. In this research, we used Google Cloud APIs to compute thousands of features from large creative samples to conduct a creative meta-analysis. We combined these machine learning and computer vision features with human-coded elements.
The ad impact is often not obvious. Surveys provide blurry and quite contradictory answers. Behavioural tools turned out to be a godsend! By testing top brand campaigns with nonconscious research tools, we received unexpected non-trivial insights.
Course5's approach complements traditional ad testing and supports the brand marketer with insights through a simple online tool that uses AI to mine past data and pre-evaluate new creatives. The paper discusses how AI can be used to help address these business questions. It specifically demonstrates the use case of optimizing a creative by providing inputs to improve its chances of success. This is done with branding-related insights, using Intel's past research data together with computer vision/audition algorithms and machine learning technologies in the Course5 Research Suite.
Ads that trigger any emotion work better than those that don't. Ads that trigger the right emotion work even better. A persistent problem, however, has been detecting unspoken feelings: the real emotions that an ad generates. The aim of this study is to enable VF to know which emotions their brands elicit, to decide whether those emotions are aligned with the brands, and to determine how much better ad performance would be if emotional targeting were used earlier in the creative process.
Together with Kantar, Zappi conducted a research study to determine the trends and creative traits that can help advertisers maximize efficiency. We tested 20 video ads across four categories using our consumer insights platform. This paper shares the key takeaways.
Is it possible to test whether branded content met its campaign objectives? See how the BBC combined the latest in emotion tracking tech with devilish psych methods to produce a new research tool that is taking the advertising world by storm. In 2017 we set out to create the SOE (Science of Engagement) Toolkit, a new campaign-effectiveness tool for content marketing: a tool focused on measuring emotion and on correlating the emotional effects with changes in subconscious brand association.
This paper introduces several new methods aimed at improving ad testing. The power of interactive television and scientific statistical techniques are leveraged to create a unified methodology for testing ads. The methodology makes three specific improvements: the utilization of a random probability sample, the development of a system that allows for self-administered ad experiments conducted in respondents' own homes, and the employment of a fully randomized panel experimental research design. The paper draws on empirical evidence from the Yale Advertising Study, which measured the effect of political ads on voter attitudes and behavior and included 12,350 interviews.