How it all started

I wrote this story a while ago, before we started this blog, with the intention of eventually publishing it here.

It all started in August 2007, when I had to travel across the continent almost every week and started shopping for a lightweight laptop computer suitable for business travel.

While selecting the parameters for my planned purchase in terms of specifications was not too difficult, if you know what you want, predicting the quality of your ownership experience can prove much more complicated. I have been an online shopper for a long time, but the rising cost of shipping and the hassle of dealing with less and less responsive customer service reps started to outweigh the original savings and conveniences. As a result, I became much less impulsive with my online purchases: in this economy there is less money to spend and less time to waste, so I figured a small investment in initial research would be worthwhile.

I found three major sources of information that were quite useful to assist me in making the choice:

  1. Product specifications provided by the Original Equipment Manufacturer. These are part of advertising and marketing collateral, designed to create our expectations of functionality and performance, but they provide little help in gauging the probability that those expectations will be met;
  2. Editorial reviews provided by magazine and online publishers. These offer a glimpse of potential user experiences, which is quite valuable, but they are substantially removed from the regular consumer environment: the editors test carefully selected and tuned equipment provided at no cost by the OEM, and they don't have to deal with fulfillment, delivery, or customer service issues. Unfortunately, Consumer Reports had not reviewed the laptops I was interested in, so their recommendations were not available. A relatively quick tour of a few popular web sites helped me create a short list of two laptops that met my requirements;
  3. Consumer (user) reviews provided by actual purchasers of the product, who share their personal experiences and rate their satisfaction. There is a relatively high probability that you will be very satisfied with a product rated very highly by most reviewers. Conversely, you will do well to avoid products rated very low by most reviewers.

The two laptops I had short-listed for purchase, based on product specifications and editorial reviews, had very similar reputation ratings of 3.5 stars out of 5 — not perfect, but acceptable. So what is the next step? Toss a coin? Is it safe to assume that these two products have the same reputation and would be equally satisfying purchases?

It turns out that this would have been a very bad assumption.

Usually I lack patience (those who know me, please stop laughing — "Good men know their limitations"), but this time I decided to assess whether the reasons that prevented these laptop users from giving the highest satisfaction rating of 5 were "showstoppers" for me or not; we all have our own limits of tolerance for different experiences. I ended up hunting for and reading through dozens, sometimes hundreds, of reviews and found that the most negative experiences with laptop #1 centered on overheating issues and the resulting customer service hassles, while the negative reports about laptop #2 focused on order processing and fulfillment problems.

I invested 8-10 hours of my time in research and spent $120 more than originally anticipated to purchase laptop #2 from an online retailer that had it in inventory, bypassing the fulfillment problem, and I have been enjoying my laptop without any reliability problems since. I hear the OEM of laptop #1 has finally found workarounds for the overheating problems and is now working to pacify many of its very vocal and unhappy customers.

As smug as I am about this experience, such research requires too much time and patience, and I wondered how much easier it would be to have more meaningful product reputation ratings. So I looked, but could not find anything unbiased, consistent, and verifiable that would work for me. That is how my new project, Amplified Analytics, was born. Please look around the site and let me know if it makes any sense to you. I would love to hear your experiences related to product reputations and user reviews.

This entry was posted in Customer Intelligence.

2 Responses to How it all started

  1. The last few laptops I bought, I did so in less than 15 minutes each! They were rather expensive because they were very lightweight (my last laptop has already traveled over 100,000 air miles). Spending a day on choosing which laptop to buy would carry an opportunity cost similar to the value of the laptop, so I take another approach: within the same price and functionality category, most products are probably of approximately the same quality (it's like comparing BMW against Mercedes — both are excellent, although different). The question is, does that make me a (typical/potential) user of Amplified Analytics, or does my behavior rule me out?


  2. Gregory says:


    Thank you for your comment.

    Many great brands have experienced product issues. I recall the look of sheepish embarrassment on the face of a BMW owner as his brand new, fine driving machine was towed to the dealership for repairs.

    There are no guarantees in life, but I want the comfort of knowing, from the experience of other people who bought the product, that the probability of getting problematic quality and poor support is low.

    The typical/potential user of our products is a product/marketing manager who cares enough to know what the product's reputation in the marketplace is and what he/she needs to do to improve it.

    The consumer is the final beneficiary of the process, and can choose to participate in the co-creation of value by sharing his or her Customer Experience at any Social Media venue and selecting only products from companies that create quality Customer Experiences for their customers.

Comments are closed.