When choosing a social media monitoring tool, there are many questions to consider: why are there so many approaches and services? How do they differ? What justifies the variation in price? Can't we get this for free? What is actually being measured? What resources should we invest? In this presentation Brad Little (Nielsen) examines the differences between social media monitoring tools, how they work, and what to consider when choosing a provider. He aims to get beyond the sales hype and look under the bonnet to help you select the right solution for your company.
15. "That buzz is up over 5X" is a fact (not an insight)
17. Thank You! Brad Little @bradleyjlittle BuzzMetrics, EMEA [email_address]
Editor's Notes
1) We don’t measure TV the same way we measure newspapers, so why would anyone think that Twitter is the same as Facebook, or a blog? Much of the conversation about social media still treats it as if its strength were purely an advertising paradigm. What about unique insights (an unaided focus group)? What about customer service or product development? What about eReputation, threats, PR/comms, and just about every other part of your business? 2) If you spent 50 million investing in technologies to measure this stuff, would you give it away for free? No (unless you are Google, of course). 3) Do organizations really want to do this, or does saying it just help the share price?
1) Yes, but it will do some things, some of the time, and to some degree. It will rarely do all (or even most) of what you need. 2) Maybe, but that is an awful comparison. Google will include spam and content not considered CGM/WOM. 3) This is a terrible way to compare tools. If one company uses traditional media clipping feeds or Twitter data, for example, its coverage numbers would jump, but that may not reflect what you want to use the tool for. And what about spam and redundancy? The tool with the most of these wins! 4) These are typically terrible. A tool-based provider loves to give away free trials because the tool is all they offer. For a more well-rounded provider, a good trial means configuration and analyst time to build strong classifiers/topics (which limit erroneous messages), plus tight service and support to get your team working with the tool efficiently. It’s not that these tools are clumsy out of the box; it’s that they work so much more efficiently and effectively if the client invests the time with a partner to set them up properly. When you get a new PC, does it work just how you like it out of the box? Nope — not even a Mac.
3) Where does the data come from? Is it simply pumped out of a tool, or are people involved? What are their skills, and what is their level of involvement?