What People Say is Often Different from How They Feel
TLDR: Self-report data is often very limited, and biometrics sometimes tell a different story. Understanding this limitation is critical in high-stakes situations such as a Presidential Debate.
In this independent study, MediaScience used eye tracking, biometric measurement, facial coding, and self-report to analyze the second 2016 Presidential Debate live across our labs in Austin and Chicago. Our data showed that the emotional dynamics of the audience were very different from the rational self-report results of a post-exposure survey.
Self-report data from the debate indicated that the Democratic nominee, Hillary Clinton, was the clear favorite and had won the debate. However, when we analyzed the biometric, eye tracking, and facial coding data, the Republican nominee, Donald Trump, generated strong biometric responses among undecided and Republican voters.
We found that Trump desensitized the audience, regardless of party affiliation, over the course of the debate. More surprising was that Clinton did very poorly with Democrats: our facial coding data revealed negative valence among voters who self-identified as Democrats, particularly when she spoke about the email scandal and her planned policies.
Further, there was a significant gender divide on specific topics that both candidates addressed, such as the Trump sex tapes and questions about the Middle East.