Social Media Monitoring vs Customer Intelligence Analysis

Based on the questions I often get from marketing practitioners after webinars and speaking engagements, there is considerable confusion about the difference between Social Media Monitoring and Customer Intelligence methodologies. Below is my first attempt to establish a clear demarcation line between the two approaches. Please help me refine this matrix with your feedback, comments, and disagreements.

 

Social Media Monitoring (SMM) vs. Customer Intelligence Analytics (CIA)

SMM: Captures and measures Word of Mouth communications generated by anybody or anything: consumers, bloggers, marketers, pundits, industry analysts, customers, and automated reposting and SEO software. The content originates from Social Media networks and other public (Internet) sources.

CIA: Captures and measures only customer communications about a product or service they have purchased and experienced. The user/customer-generated content (UGC/CGC) can originate from public (Internet) or private (company) sources.

SMM: Transactional analytics, i.e., how much buzz there is about keyword=XYZ and whether it is positive or negative. Focus is on BUZZ.

CIA: Contextual analytics, i.e., why customers purchased this product, what they like and don't like about their experience, and to what degree. Focus is on Customer Experience.

SMM: Provides two-dimensional measurements (velocity and sentiment) per keyword provided, i.e., how fast and furious the communications are generated and whether they are negative or positive.

CIA: Provides three-dimensional measurements per product/service: discovers which attributes of the customer experience with the product are important to customers, measures how important each attribute is, and measures the difference between customer expectations and customer experience for each attribute.

SMM: Communications are monitored and trended over an immediate-to-short time frame.

CIA: The time frame is determined by the product life cycle and by the trending of post-shipping customer feedback metrics.

SMM: Excellent for PR and Customer Support Crisis Management applications.

CIA: Excellent for Strategic Marketing, Marketing Communications Effectiveness, Product Management, Customer Support Management, and Purchasing Management applications.
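The difference between the two measurement models can be illustrated with a small data-structure sketch. All class names, fields, and numbers below are hypothetical and for illustration only; they do not describe any actual product or platform.

```python
from dataclasses import dataclass

# SMM: two dimensions per keyword -- velocity and sentiment.
@dataclass
class SmmReading:
    keyword: str
    mentions_per_day: int    # velocity: how fast the buzz is generated
    avg_sentiment: float     # -1.0 (negative) .. +1.0 (positive)

# CIA: three dimensions per product attribute -- which attributes of the
# experience matter, how much each matters, and the gap between
# customer expectations and actual experience.
@dataclass
class CiaAttribute:
    name: str                # e.g. "battery life"
    importance: float        # 0..1: how strongly it drives overall satisfaction
    expectation_gap: float   # experience score minus expectation score

smm = SmmReading(keyword="XYZ", mentions_per_day=1840, avg_sentiment=-0.32)

cia = [
    CiaAttribute(name="battery life", importance=0.8, expectation_gap=-1.4),
    CiaAttribute(name="screen", importance=0.5, expectation_gap=0.3),
]

# SMM answers: how loud and how angry is the buzz about "XYZ"?
# CIA answers: which parts of the experience disappoint, and by how much?
```

The contrast in shape is the point: SMM yields one velocity/sentiment pair per keyword, while CIA yields an importance and expectation-gap reading for every attribute of the experience.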

 

This entry was posted in Customer Intelligence.

11 Responses to Social Media Monitoring vs Customer Intelligence Analysis

  1. Eric Abdullateef says:

    Hi Gregory:

    As promised, here I am leveraging your good work.

    Proposal

    TITLE Social Media Monitoring – Donor Intelligence Analytics for Tech Savvy NGOs

    AUTHOR: Eric Abdullateef, M&E Specialist, eabdullateef@usaid.gov, Phone: 571-215-0541

    Abstract

    Social Media Monitoring (SMM) in the NGO context captures and measures peer-to-peer or word-of-mouth communications generated by anybody or anything concerning one or more of the key issues that your NGO is advocating for. SMM entails taking the pulse of public sentiment, whether among donors, supporters, bloggers, marketers, pundits, analysts, or beneficiaries, or even surveillance of automated search engine optimization (SEO) software. Pulse-taking data originate from social media networks, online communities of practice (CoP), and other public (Internet) venues. This article proffers a formative appraisal approach called the Rapid Appraisal Process (RAP). It enables communications staff to extract policy positions from “virgin” texts found in a purposeful scan of the relief and development blogosphere. Online social media spaces capture America’s dialogue about our relationship with the world. They can be analyzed at a point in time or over a period of time.

    Whether for fundraising, advocacy, or education, NGOs regularly need to engage many publics. With limited resources and capacity, it is a challenge for NGOs to plan and communicate strategically. The Rapid Appraisal Process (RAP) highlighted here offers NGOs a simple way to tease out the complexity of public attitudes towards their focal issue areas or projects. It enables them to strategically reshape the discourse against the backdrop of public engagement good practice, the politics of the day, and a rich body of literature on humanitarian and development communications.

    Relevance Statement

    There is clearly a need for innovative and creative ways of presenting global issues to different publics if US NGOs working abroad are to animate a new public debate around the limits and potential of international humanitarian cooperation.

    Keywords

    international development communications, development education, public affairs, foreign aid, foreign assistance, public engagement, public diplomacy, advocacy evaluation

  2. Gregory says:

    Thank you, Eric. Please let me know if you think of any more differences between the two that I have missed in my matrix.

  3. Michael Hollon says:

    I do not have any disagreements, but I’m still not clear on the top issue on my mind about either SMM or CIA, and that is reliability.

    I still need to be convinced that any platform / algorithm / sentiment-analysis (or text-analysis) / natural-language tool can reliably and accurately summarize unstructured feedback from consumers. I’ve read articles whose authors analyzed actual human-written language, with all its nuances and vagaries, as measured by human beings, and compared that measurement to the analysis from various automated systems. Their findings assert that the various types of automated analysis over-promise on accuracy and reliability.

    I’d be grateful for any thoughts that can move my thinking forward.

  4. Gregory says:

    I used to have a sign on my desk that read “Happiness is in expectations management”. Personally I do not expect 100% accuracy from humans or algorithms when it comes to scoring, rating or interpreting customer experience. Please note that I specifically focus on unstructured (text) data describing an experience with a product or service.

    I’ve never met a serious NLP pro who would claim algorithmic accuracy over 80%; however, nobody could be very specific about the methodology for measuring this accuracy. In addition, they spoke about purely statistical models.

    I can only speak about our methodology and measurement of accuracy.

    1. We extracted 100 customer reviews about the same product, but from different customers and different sites (data sources).
    2. We removed the authors’ scores for the overall sentiment rating of the product.
    3. We hired a trained Market Research intern to read and score each review based on its content.
    4. We added the intern’s scores and produced an average score for the batch.
    5. We processed the same batch with our algorithms, which scored each review.
    6. We added the algorithm’s scores and produced an average score.
    7. We added the customers’ scores for the same batch and calculated the average score.
    8. We compared the 3 numbers.
    9. We assumed the intern’s score to be the “gold” standard.

    The customers’ self-assessed average score was 81.3% of the “gold” standard.
    The algorithmic average score was 92.7% of the “gold” standard.

    These results came from the last of the 15-20 studies we conducted over a period of development, and every time the algorithms were more accurate than the customer-assessed scores. The interesting twist came when we asked two (and later three) different MR interns to score the same batch of content: their scores, i.e., their interpretations of the content, varied by close to 10%. My point is that algorithmic analysis of unstructured data produces more consistent results than human beings do.

    Reliability has to be defined in specific and practical terms and in the context of the discussion. I would like to propose that, in this context, a technology can produce reasonably accurate, consistent, and transparent results much faster and more economically than humans.
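    The scoring comparison described above can be sketched in a few lines of code. Only the mechanics of the comparison are taken from the steps above; all scores below are hypothetical, on an assumed 1-5 scale, and the resulting percentages are illustrative only.

```python
def batch_average(scores):
    """Average score for a batch of reviews."""
    return sum(scores) / len(scores)

def percent_of_gold(avg_score, gold_avg):
    """A batch average expressed as a percentage of the gold-standard average."""
    return 100.0 * avg_score / gold_avg

# Hypothetical per-review scores on a 1-5 scale for a small batch:
intern_scores = [4, 3, 5, 2, 4]      # trained intern: the "gold" standard
algorithm_scores = [4, 3, 4, 2, 4]   # algorithmic sentiment scores
customer_scores = [3, 2, 4, 2, 3]    # customers' own overall ratings

gold = batch_average(intern_scores)
print(f"algorithm: {percent_of_gold(batch_average(algorithm_scores), gold):.1f}% of gold")
print(f"customers: {percent_of_gold(batch_average(customer_scores), gold):.1f}% of gold")
# -> algorithm: 94.4% of gold
# -> customers: 77.8% of gold
```

    The same averaging applied to two or three interns' score sets would expose the inter-rater variance mentioned above.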

  5. Michael says:

    Thanks for sharing the information. Keep it up.

    Thanks,
    Michael

  6. dominique says:

    Very interesting distinction. It’s true that many people confuse monitoring and insights capture/discovery.

    One thing… you can extend the Intelligence analysis to non-customers. Taking the example of HP (one of our clients), you can get insights from:
    – CIO
    – Security experts
    – Cloud computing world

    This requires that you profile a large number of people in these different segments and that your listening/analysis is open and less directed than monitoring.

  7. Gregory says:

    Dominique,

    Thank you for your comment.

    I agree. When a subject of your interest impacts the experience of a number of individuals, the extended analysis can produce very rich intelligence.

  8. Pingback: The Contextual Side of Customer Experience Analytics | Amplified Analytics Blog

  9. Pingback: The Contextual Side Of Customer Experience Analytics

  10. Pingback: How to make Social Media relevant to Business | Amplified Analytics Blog

  11. Pingback: Is Social Media Relevant To Business?
