Tuesday, September 1, 2015

Text Analysis - Understanding Operational Change

I had a very interesting conversation with a customer recently.  The conversation resulted in the customer signing a contract for Etuma's text analysis service (Feedback Categorizer / Insight Portal).  I had been in discussions with this particular customer for a number of months with the dialogue going something like this:

- Customer: We really like Etuma's ability to categorize and sentiment-rate our transactional NPS survey feedback. But, we know what we need to do to improve our NPS scores and we're investing heavily to make lots of those changes. So, we can't justify the time and effort to buy and use Etuma right now.  

- Me: I understand; resources are tight all around, and taking them from things you know you need to do in order to do things you may need to do can be a difficult sell.  Let's revisit this in a few months when your investment and resource commitments have stabilized a bit. I'll touch base then.

I've known this particular customer for several years. So, instead of pushing what I knew was a good argument for Etuma, I thought it better to just wait until things calmed down for him.  So, what changed?  Why did this customer reach back out to me and decide to acquire Etuma Text Analysis?

The answer, of course, is simple.  The boss (the CFO in this instance) wanted some way of understanding whether all the financial and other resources they were investing in "doing the things they need to do to improve" were actually helping them improve. What happened was that this company's NPS scores started rising, which they had hoped and expected would happen. But, the question they couldn't answer easily was which of the operational changes they had made were driving the change in NPS scores. With Etuma, changes in feedback, from both a topic perspective and a sentiment perspective, are easily mapped to the implementation of specific operational changes, giving management a very rapid and correlated understanding of the effects their decisions are having on customers.
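To make that mapping concrete, here is a minimal sketch in Python of the underlying idea. The record layout and the -1 to +1 sentiment scale are illustrative assumptions, not Etuma's actual export format; the sketch simply compares a topic's average sentiment before and after the date an operational change went live:

    from datetime import date

    # Hypothetical categorized, sentiment-scored comments:
    # (feedback date, topic assigned by text analysis, sentiment from -1 to +1).
    feedback = [
        (date(2015, 3, 10), "delivery speed", -0.6),
        (date(2015, 4, 2),  "delivery speed", -0.4),
        (date(2015, 6, 15), "delivery speed",  0.3),
        (date(2015, 7, 1),  "delivery speed",  0.5),
        (date(2015, 6, 20), "billing",        -0.2),
    ]

    # Date a specific operational change went live (illustrative only).
    change_date = date(2015, 5, 1)
    topic = "delivery speed"

    def avg(scores):
        return sum(scores) / len(scores) if scores else 0.0

    before = [s for d, t, s in feedback if t == topic and d < change_date]
    after  = [s for d, t, s in feedback if t == topic and d >= change_date]

    print(f"'{topic}' sentiment before change: {avg(before):+.2f} ({len(before)} comments)")
    print(f"'{topic}' sentiment after change:  {avg(after):+.2f} ({len(after)} comments)")

Run across every topic and every change date, a comparison like this is what lets management tie a rising NPS score back to the specific changes customers are actually commenting on.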

Fortunately, implementing Etuma is almost as easy as deciding to buy it. So, getting data on NPS and its changes is a fairly simple matter.  It's great to have another Etuma customer. It's even better to see that corporate management is willing to invest in methods for understanding whether their actions are producing the results they seek. Better still for this company, the investment needed to acquire Etuma Text Analysis was very modest.

Stewart Nash
LinkedIn: www.linkedin.com/in/stewartnash

Try Etuma for free.  Click Here




Saturday, August 29, 2015

The Survey is Dead, Long Live the Survey - Part 2

Four years ago I wrote a post here titled "The Survey is Dead - Long Live the Survey".  At the time, over-surveying was a problem being talked about a great deal in the trade magazines and blogs. Response rates were declining across the market research industry. The argument then was that social media data streams would replace a lot of surveys.

Fast forward to 2015. I think the exact opposite has occurred. Today, more people are being offered more surveys than ever before. And response rates appear to have stabilized or maybe even gone up.  In 2011, I said that "social media may actually increase both the frequency as well as the value received from customer surveys". I believe this statement has proved even more true than I had imagined it would. But, it's been for different reasons. Or, maybe, for reasons related to those I had mentioned. I had thought that social media would force businesses to respond to feedback in more real-time ways. And, today many do so, mainly through on-line chat applications. But, almost all electronic methods of communicating with businesses today offer some kind of feedback capability, and virtually all of them employ some kind of survey technology to collect and distribute the feedback.

So, in point of fact, surveys today are more ubiquitous than ever before.  And the reason is simple: businesses need to know something about the people giving them feedback in order to act on it.  Also true is that transactional and even relationship surveys are even more widely used now than then. Again, for the reasons I had described: they simply provide businesses with too much valuable, actionable feedback (at a very low cost) to stop being a core feedback management tool. What has changed is that surveys are being better designed and are a lot more respectful of people's time than in the past, so more people are willing to take them.

Of course, survey technology is improving. Today, many web surveys are Mobile Device Enabled (MDE). MDE surveys appear to generate higher response rates, in my opinion because people can use their smartphones on the subway, train or bus (or anywhere else that's connected to the internet) to take them. SMS text messaging can also be used to deliver a survey to mobile users, meaning people don't even need e-mail accounts to be reachable anymore. Survey links are embedded in social media all the time now, further extending their potential reach. As a result, survey-able populations are larger. And, with larger populations and higher response rates, it's no surprise that more surveying than ever is going on.

A plug for QuestBack.  All QuestBack Essentials surveys are fully MDE. They can be delivered via URL, Email, QR Code, Pop Up Script, embedded in social media or even sent via SMS text.

Surveys aren't dead at all.  They are more alive than ever before, used in more places and for more purposes than ever before and providing more value to businesses than ever before.  Long Live the Survey!

Stewart Nash
LinkedIn: www.linkedin.com/in/stewartnash




Tuesday, August 25, 2015

Surveying in support of unsolicited verbatim feedback

I've written a number of posts here focusing on ways to employ outbound processes for soliciting customer feedback. But, by definition, outbound feedback collection requires enough customer knowledge to determine which customers to engage, when to engage them and what topics to try and engage them on. Some of that data can be drawn from CRM databases. But, what about when no solid information exists about a customer?  

In my most recent post I discussed how text analytics can be employed to categorize verbatim feedback from transactional surveys, and how this categorized data might then be used to trigger follow-on, topic-driven surveys. This post elaborates on that topic and extends into other arenas where verbatim customer feedback can be analyzed, categorized and potentially used as input to survey processes. 

One important type of unsolicited feedback companies receive comes from social media and company websites, where customers visit without being asked to, or in response to some level of marketing. The feedback businesses get from these sources can be difficult to act on unless it is reviewed at the actual comment level, which of course takes a lot of resources. Text analysis tools are a logical mechanism for reducing the manpower needed for follow-up. Since topical categorization of social media based feedback is a simple matter for text analysis systems today, using them to create feedback categories is a logical step. However, most verbatim feedback, taken without the context of customer data, is also hard to act on.  So, surveying subsets of customers based on their topic and sentiment profiles also makes logical sense.  The same way drill-down surveys can help with simple transactional feedback (last month's post), they can help with understanding unsolicited feedback. 

It seems to me that getting survey questions out to people interacting on social media, or getting survey-like data into social data streams, would clearly add value to the data. An example of how this might work:
- "Business A" had 10,000 comments come in last month through its website, where it offered a feedback form with a single NPS question and a verbatim comment box. The 10,000 comments were run through text analysis. 1,000 comments were categorized as referencing "product quality", and of those, 200 were from "detractors" and 200 from "neutrals". 
- In situations like this, the product management (PM) team would want to ascertain whether product quality was in fact a driver of the NPS score. By sending a follow-up survey about products to this specific group of detractors and neutrals, Business A's PM team can generate further and more detailed data about any product quality issues customers are experiencing. 

The benefit of a process like the one I'm describing above is that it eliminates the need for a person to go through all the "product quality" related feedback to generate the needed information about product quality issues. People taking the follow-up survey would provide quantitative data about the specifics of product quality, and the qualitative data in the text would then make a lot more sense. Reacting to feedback using a process like this would help businesses react to the right issues, as well as not overreact to non-issues.
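Here is a minimal Python sketch of the selection step in that example. The record fields, the survey name and the send_survey() helper are hypothetical placeholders rather than any particular product's API; the point is simply filtering categorized comments by topic and NPS segment before triggering the follow-up survey:

    # Hypothetical categorized-feedback records: each carries the NPS score (0-10),
    # the topics assigned by text analysis, and a contact e-mail address.
    comments = [
        {"email": "a@example.com", "nps": 3,  "topics": ["product quality"]},
        {"email": "b@example.com", "nps": 8,  "topics": ["product quality", "price"]},
        {"email": "c@example.com", "nps": 10, "topics": ["delivery"]},
    ]

    def nps_segment(score):
        # Standard NPS banding; "neutral" here is the 7-8 band (often called "passive").
        if score >= 9:
            return "promoter"
        if score >= 7:
            return "neutral"
        return "detractor"

    # Detractors and neutrals whose comments were categorized under "product quality".
    follow_up_list = [
        c["email"]
        for c in comments
        if "product quality" in c["topics"]
        and nps_segment(c["nps"]) in ("detractor", "neutral")
    ]

    def send_survey(recipients, survey_name):
        # Placeholder for however the follow-up survey actually gets distributed.
        print(f"Queueing survey '{survey_name}' for {len(recipients)} recipients")

    send_survey(follow_up_list, "product-quality-follow-up")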

With more and more customer feedback being unsolicited in nature, it's important for businesses to begin thinking about ways to attach specific quantitative data to the streams of text entering their processes.  Adding surveys driven by text analysis results can help.

Stewart Nash
www.linkedin.com/in/stewartnash









Sunday, July 19, 2015

Should verbatim analysis drive customer survey processes?




Traditionally, verbatim analysis techniques and tools are applied in post-survey analyses of customer feedback.  But, with the increasing prevalence of very brief transactional surveys and of social media channels generally, I'm wondering if the process should be reversed, i.e., analyze verbatim comments to understand where deeper-dive surveys should be deployed.

I'm seeing an increasing number of companies these days that use a very simple transactional Net Promoter or CSAT survey process. Typically, these firms are asking a single question: "How likely are you to recommend [company] to your family and friends?", accompanied by a comment box. Or, "Rate your satisfaction with [company]", accompanied by a comment box. It seems to me that there are advantages to this kind of process including:
  • Higher response rates
  • Really simple and fast implementation of the survey process itself
  • Low cost 
It would also seem that there are some disadvantages to this type of process, including, among others:
  • Figuring out who should do follow up on received feedback.  
  • Figuring out how to appropriately follow up on received feedback.
  • The challenge of analyzing feedback from surveys where no questions exist about key aspects of the company's offerings (examples being product / service quality or value vs. competition).
For companies early in their existence, simple, easy and low cost are important characteristics for a customer feedback program. But, as they grow, understanding the drivers of loyalty or satisfaction becomes more important to successfully operating the business. This leads growing companies to use more and more manual resources for feedback analysis, which of course costs money. The way it works: at first, one person is assigned to read, categorize and action issues identified in survey comments. When comment volume gets more significant, they expand to a team of people. And so on.  

Once multiple people begin analyzing verbatims individually, interpretation issues begin to creep into the analyses.  The more people categorizing text the more interpretation issues there will be. Automated text analysis tools are a good next step for customer feedback evaluation when teams of people begin getting involved. They reduce labor costs while improving the quality of the analyses. 

Even automated and consistent text analytics isn't sufficient to fully understand the voice of the customer (VOC). Additional quantitative data is also necessary, so some level of follow-up survey process needs to be implemented. In fact, several separate follow-up surveys might potentially be used, depending on what the verbatim feedback is indicating. For instance, if product or service issues are highlighted by people responding to the survey, a follow-up survey asking a few more questions (fewer than 10, ideally) should be sent in order to better understand what those issues might be.
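One way to picture that is as a simple mapping from text-analysis categories to short follow-up surveys. The Python sketch below is a minimal illustration under assumed names; the categories and survey identifiers are hypothetical, not real Etuma categories or QuestBack survey IDs:

    # Hypothetical mapping from text-analysis categories to short follow-up surveys.
    FOLLOW_UP_SURVEYS = {
        "product quality": "product-quality-deep-dive",    # fewer than 10 questions, ideally
        "service":         "service-experience-follow-up",
        "price/value":     "value-vs-competition-survey",
    }

    def choose_follow_up(categories):
        # Return the follow-up surveys suggested by the categories found in a comment.
        return [FOLLOW_UP_SURVEYS[c] for c in categories if c in FOLLOW_UP_SURVEYS]

    # Example: one respondent's comment was categorized under two topics.
    print(choose_follow_up(["product quality", "delivery"]))
    # -> ['product-quality-deep-dive']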

A number of companies I talk with these days have this scenario, where they're receiving lots of multi-channel verbatim feedback beyond that contained in their transactional surveys. Twitter feeds, customer service chats, feedback forms on their websites, etc., all offer text analysis opportunities and will provide indicators for follow-up, deeper-dive surveys. Companies should look to text analysis tools to standardize their verbatim feedback analyses. But, they should also look for ways to use the verbatim analysis results to drive follow-up surveys that lead to a higher level of VOC insight.

Transactional feedback is a key element of any company's customer experience strategy.  But, understanding the customer experience is more involved and requires more information about customers than simple transactional surveys can provide. Combining them with text analytics and targeted follow-up surveys can provide that richer set of insights.

- Stewart Nash
QuestBack USA
Etuma USA
LinkedIn: www.linkedin.com/in/stewartnash

Sunday, June 14, 2015

"Ask and Engage" with Customer Surveys

Feedback Action Loop
Early on in its history (2001?), QuestBack coined the term "Ask and Act" to describe how its web survey solutions provide companies with an easy mechanism to follow up on customer feedback. But, with only a rudimentary e-mail capability, QuestBack's utility was largely limited to specific feedback projects, typically research- or customer-service-oriented.  That has all recently changed...

Feedback Based Engagement Loop
Fast forward to today.  QuestBack should be renamed "Ask and Engage". Its HTML-based e-mail tools and media libraries allow marketers to build powerful messaging programs that present brands or introduce products / services, all while collecting data, qualifying or generating leads and prospects, or engaging with customers about various aspects of the company's relationship with them. "Rules" embedded in the QuestBack surveys that accompany marketing messages let companies design engagement to maximize the business value from each contact they touch with QuestBack.

So why would a QuestBack-based customer engagement program be better than, say, lead nurturing using e-mail tools? A couple of reasons. Rules-based follow-up means that a "rule" determines what kind of engagement should occur. Should a contact's response profile indicate a certain type of engagement, that can be embedded as a QuestBack follow-up "rule" which generates a follow-up action automatically.  Rules can also be constructed, using company data, so that the person who "owns" a particular kind of engagement with a particular class of lead, prospect or customer is automatically brought into the engagement process.
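Here is a minimal Python sketch of what rules-based follow-up can look like conceptually. The rule structure, field names and owner assignments are hypothetical illustrations, not QuestBack's actual rule syntax; each rule pairs a condition on a contact's response profile with the engagement action and owner it should trigger:

    # Hypothetical rules: a condition on the response, plus the action and owner to trigger.
    RULES = [
        {
            "when": lambda r: r["interest"] == "demo request",
            "action": "schedule demo call",
            "owner": lambda r: r["region"] + "-sales",
        },
        {
            "when": lambda r: r["satisfaction"] is not None and r["satisfaction"] <= 2,
            "action": "service recovery follow-up",
            "owner": lambda r: "customer-care",
        },
    ]

    def route(response):
        # Return the engagement actions this response triggers, with their owners.
        return [
            {"action": rule["action"], "owner": rule["owner"](response)}
            for rule in RULES
            if rule["when"](response)
        ]

    example = {"interest": "demo request", "satisfaction": 4, "region": "us-east"}
    print(route(example))
    # -> [{'action': 'schedule demo call', 'owner': 'us-east-sales'}]

The same structure extends naturally: add a rule, and the corresponding engagement and owner are applied automatically to every new response that matches it.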

Since QuestBack operates in social media platforms like Facebook or on company websites, engagement doesn't have to be just outbound e-mail driven. Done well, and using closed-loop follow up, surveys themselves are engagement tools. In fact, they are typically very good engagement tools.  Even when going out in pure marketing blasts, 1-2% response rates are achievable. When going to customer bases, qualified leads or other folks with a connection, response rates can be north of 30%, a rate e-mail marketers would kill for.

"Ask and Engage" should be an approach marketers start to adopt more and more.