Thursday, February 14, 2013

Lowering "Break Even" for justifying Text Analytics

In a world where most businesses handling customer and employee feedback still "code" their verbatim survey responses and other text feedback manually, the break-even point for justifying an automated text analysis solution generally seems to be on the order of 10,000 text items per month.

Why is 10,000 the number?  In my experience, I've seen many instances where smaller volumes would justify investment in an automated solution.  Yet, to a large degree, only very large businesses and government agencies with big flows of text-based feedback have adopted automated text analysis solutions.

I think there are two reasons for this.  First is the price of automated text analysis solutions, which typically carry minimum costs of $100,000 per year.  So, for a business with lots of text to be coded, it only makes sense to invest in an automated solution when "people costs" exceed $100k/year.  The second reason is that few businesses today do verbatim coding as a dedicated job.  Typically, groups of people do the work in different departments as part of their regular roles (VOC Analyst, Market Research Manager, etc.), so there's often no single FTE that can be "replaced" by an automated solution.  Only when the volume of feedback becomes so large as to be overwhelming do businesses consider automating the analysis process.  By then, the costs of manual coding are large and they justify a large investment.

But, what would happen if the annual software cost of an automated text analysis solution could be lowered to $50,000 per year?  I think the potential market for automated text analysis would become dramatically larger.  After all, in most businesses it's a lot easier to find half an FTE's worth of manual text analysis than it is to find whole FTEs doing it.

In my opinion, there are additional reasons to consider automating text analysis at feedback volumes below 10K per month.  Just one is the ability of an automated solution to identify new topics.  As someone who does a number of feedback projects that employ survey-based open-answer questions, I regularly evaluate verbatim responses both manually and via automation.  Whenever I've used Etuma360, I've found that the automated analysis identifies topics I had not considered during my manual inspection.  And, since people doing manual coding have a propensity to map all incoming feedback to the existing coding structure and categories, manual processes will tend to miss new topics.  Automated solutions will typically pick them up.  Valuing this capability is difficult, but it's something to consider when weighing the cost and benefit of text analysis.

Etuma has a number of pricing plans that allow businesses to get into automated text analysis for less than the $100K/year price point.  I would think that anyone with 2,000 pieces of text feedback per month is a candidate for an Etuma360 implementation based on FTE considerations alone.
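
To make the FTE arithmetic above a bit more concrete, here's a rough back-of-the-envelope sketch in Python.  The coding throughput, hourly labor cost and software prices are my own illustrative assumptions, not vendor quotes or market figures.

```python
# Back-of-the-envelope sketch of the manual-coding vs. software break-even
# discussed above.  Every rate and price here is an illustrative assumption.

ITEMS_PER_HOUR = 50          # assumed manual coding throughput
LOADED_HOURLY_COST = 42.0    # assumed fully loaded analyst cost, $/hour
HOURS_PER_FTE_YEAR = 2000    # nominal working hours per year

def annual_manual_cost(items_per_month: float) -> float:
    """Estimated yearly labor cost of coding verbatims by hand."""
    hours_per_year = items_per_month * 12 / ITEMS_PER_HOUR
    return hours_per_year * LOADED_HOURLY_COST

def break_even_volume(software_cost_per_year: float) -> float:
    """Monthly text volume where manual coding cost equals the software cost."""
    hours_per_year = software_cost_per_year / LOADED_HOURLY_COST
    return hours_per_year * ITEMS_PER_HOUR / 12

if __name__ == "__main__":
    print(f"Manual coding of 10,000 items/month ~ ${annual_manual_cost(10_000):,.0f}/year")
    for price in (100_000, 50_000):
        volume = break_even_volume(price)
        fte = volume * 12 / ITEMS_PER_HOUR / HOURS_PER_FTE_YEAR
        print(f"${price:,}/year software breaks even at ~{volume:,.0f} items/month "
              f"(~{fte:.2f} FTE of manual coding)")
```

With these assumed rates, the $100K price point lands right around the 10,000 items/month mark, and halving the software cost roughly halves the volume needed to justify it.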

Tuesday, February 5, 2013

Surveying for Feedback/Response Action Management

Periodically I see discussions in articles and LinkedIn forums about the "Death of Surveys".  But, in my view, the on-line survey business is simply transforming from a focus on surveying for data collection to one of surveying for feedback and response action management (F/RAM).  This is particularly true, I think, in the case of relationship surveys (customer, partner, employee, alumni, union member, donor or "membership" types of surveys).  In short, where "relationships" exist between an entity and a population of people, something more than data collection is now necessary. 

In my view, surveying for relationship management purposes is occurring more today because of the growth of social media, on-line chat and mobile device technologies, all of which help businesses collect huge amounts of customer data.  So much so that businesses are almost overwhelmed by it.  It's not a coincidence that data analysis, "big data" and data storage vendors are doing well.  All that data needs to be analyzed, correlated, cross-referenced and stored.  Yet none of it really prompts businesses to build better relationships with the people they interact with.  Somewhere, somehow, somebody has to ask customers how they feel in order to assess relationship quality.  If a business has lots of customers, a feedback/response action management survey is the best way to do that, because the feedback automatically propagates dialogue in an F/RAM process.

Feedback/response action management is a process that many businesses are unfamiliar with.  It's a fair bit more complex than traditional market research.  It relies on customer data to guide how response action management should be implemented, and it necessitates the use of a methodological approach (NPS, CSAT, CxM or something similar).  In addition, F/RAM requires that feedback scenarios be modeled, or at least thought through, so that appropriate responses can be formulated (i.e., who responds, and how, when a customer from country X, with product Y and issue Z, triggers a response action based on their survey feedback).
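
To illustrate what "modeling feedback scenarios" can look like in practice, here's a minimal, hypothetical sketch of scenario rules expressed in Python.  The field names, addresses and rule structure are placeholders of my own, not the configuration of any particular EFM platform.

```python
# Hypothetical F/RAM routing rules: each rule maps a feedback scenario
# (country, product, issue, score threshold) to a response owner and action.
# All field names, addresses and values are illustrative only.

FRAM_RULES = [
    {"country": "DE", "product": "XYZ", "issue": "billing",
     "max_score": 6, "owner": "eu-billing-team@example.com",
     "action": "call within 24h"},
    {"country": "*", "product": "*", "issue": "*",
     "max_score": 6, "owner": "cx-desk@example.com",
     "action": "email follow-up within 48h"},
]

def match_rule(feedback: dict) -> dict | None:
    """Return the first rule whose scenario matches this piece of feedback."""
    for rule in FRAM_RULES:
        if (rule["country"] in ("*", feedback["country"])
                and rule["product"] in ("*", feedback["product"])
                and rule["issue"] in ("*", feedback["issue"])
                and feedback["score"] <= rule["max_score"]):
            return rule
    return None

# Example: a German customer with a billing issue and a low score routes
# to the (hypothetical) EU billing team; anything else low falls through
# to the general CX desk.
rule = match_rule({"country": "DE", "product": "XYZ",
                   "issue": "billing", "score": 4})
```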

A number of on-line survey platforms today can implement an F/RAM process, though some of them are expensive to acquire.  My admittedly incomplete list of F/RAM-capable tool sets includes: QuestBack (all platforms), Vovici, Medallia, Allegiance and ConFirmIT.  ClickTools and KeySurveys, to my understanding, only implement F/RAM processes through CRM integration (and ClickTools only for SalesForce).  In my experience, almost all the other tools "out there" are primarily focused on just data collection and analysis.

In my experience there are two critical capabilities a tool needs in order to implement an F/RAM process.  First, the tool must be able to trigger a real-time follow-up action based on a survey response, customer data or a combination of both.  Second, the tool must be able to link customer data to the survey, in real time, at the respondent level.  Without these two capabilities, F/RAM processes require lots of IT intervention to get survey responses to trigger actions at a respondent level.
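
Here's a minimal sketch of those two capabilities working together, written as a hypothetical response handler.  The customer lookup, addresses and mail relay are stand-ins of my own; this is not any vendor's actual API.

```python
# Minimal sketch of the two F/RAM capabilities described above:
# (1) trigger a real-time follow-up action from response + customer data, and
# (2) link customer data to the survey at the respondent level.
# The data source, addresses and mail relay below are hypothetical.

import smtplib
from email.message import EmailMessage

CUSTOMER_DB = {  # stand-in for a CRM lookup, keyed by respondent id
    "cust-1001": {"name": "A. Example", "tier": "gold", "owner": "rep@example.com"},
}

def handle_response(respondent_id: str, score: int, verbatim: str) -> None:
    """Called as each survey response arrives (e.g. via a webhook)."""
    customer = CUSTOMER_DB.get(respondent_id, {})      # capability 2: respondent-level link
    if score <= 6 or customer.get("tier") == "gold":   # capability 1: real-time trigger
        send_alert(customer.get("owner", "cx-desk@example.com"),
                   respondent_id, score, verbatim)

def send_alert(owner: str, respondent_id: str, score: int, verbatim: str) -> None:
    """Notify the designated owner so a follow-up action can be taken."""
    msg = EmailMessage()
    msg["To"] = owner
    msg["From"] = "feedback-alerts@example.com"
    msg["Subject"] = f"Follow up needed: {respondent_id} scored {score}"
    msg.set_content(verbatim or "(no comment left)")
    with smtplib.SMTP("localhost") as smtp:            # assumes a local mail relay
        smtp.send_message(msg)
```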

Tuesday, January 15, 2013

Organizing on-line surveys for action taking

People love to give feedback.  But they hate taking surveys.

I see lots of surveys that start out with "Please give us feedback" or "Your opinion is important to us", followed by "Click here to take our short survey".  After clicking, I get something that indicates the survey will take anywhere from 10-25 minutes to complete.  Like most people, I immediately quit the survey.  In my mind, any time commitment beyond 5 minutes makes me a research subject, which I didn't sign up for when I clicked on the survey link.

Because most surveys are designed to collect data and employ longer questionnaires, lots of customer feedback doesn't get collected by the companies that want and need it.  Surprisingly, even many surveys asking the Net Promoter (likely to recommend) question aren't designed to generate immediate follow-up actions.

What companies should be doing is designing surveys that customers want to take and that have built-in triggering mechanisms for responding to feedback.  In my experience, key constituencies want good relationships and will take time to give feedback when asked, provided the process respects their time and offers them something of value (a better relationship) in return.  These things become easily achievable with short, follow-up-supported surveys.

Feedback surveys should use short questionnaires with 10 or fewer questions presented and a 2-5 minute maximum time commitment.  They should only ask questions that are meaningful to the customer relationship (one reason I like Net Promoter).  They should never ask for information the company already has in a database somewhere.  And, they should always have a follow-up process for everyone who takes the survey, even if that follow-up comes later and is general in nature.

How questions are asked should also be taken into consideration when doing feedback surveys.  Here's an example of a typical product-oriented satisfaction question designed to collect data:


Here is the same question designed for feedback:


On the surface these two examples look very similar, except that here the question is followed by a more specific question based on the answer chosen.  If the answer chosen is Very dissatisfied, Somewhat dissatisfied or Neither dissatisfied nor satisfied, the customer gets:

You indicated you are less than satisfied with ACME Company Product XYZ. Please tell us why.  We will contact you shortly by e-mail to follow up.


If the customer indicates Somewhat or Very Satisfied, they get:

Please tell us what you like best about ACME Company Product XYZ.   We will contact you shortly by e-mail to follow up.

In addition to triggering a question branch, in the above example each set of answer alternatives triggers an alert or notification to someone to take follow-up action.  In QuestBack we generate an e-mail to a designated person.  In other feedback management systems (and also in QuestBack if needed), triggering is done through a CRM system.  But, in either case, the survey is optimized for feedback because no response or branching is triggered if Not Applicable is selected, and specific questions and triggers are set based on specific answer alternatives.
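
For readers who like to see the mechanics, here's a small Python sketch of the branching-plus-trigger logic described above.  It isn't QuestBack configuration; the routing addresses are placeholders, and only the answer texts come from the ACME example.

```python
# Sketch of the feedback-oriented branching described above.
# Answer texts mirror the ACME example; routing addresses are placeholders.

DISSATISFIED = {"Very dissatisfied", "Somewhat dissatisfied",
                "Neither dissatisfied nor satisfied"}
SATISFIED = {"Somewhat satisfied", "Very satisfied"}

def next_step(answer: str) -> tuple[str | None, str | None]:
    """Return (follow-up question, alert recipient) for a chosen answer."""
    if answer in DISSATISFIED:
        return ("You indicated you are less than satisfied with ACME Company "
                "Product XYZ. Please tell us why. We will contact you shortly "
                "by e-mail to follow up.", "recovery-team@example.com")
    if answer in SATISFIED:
        return ("Please tell us what you like best about ACME Company "
                "Product XYZ. We will contact you shortly by e-mail to "
                "follow up.", "account-owner@example.com")
    return (None, None)   # "Not Applicable": no branch, no alert

# Example: a dissatisfied answer branches to the "tell us why" question
# and alerts the recovery team.
question, alert_to = next_step("Somewhat dissatisfied")
```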

Good feedback-oriented survey processes should have at least a couple of trigger questions: one for loyalty or advocacy, one for general satisfaction or experience, and possibly one or more based on product or service attributes that can be boiled down to some actionable response.

Surveying key constituencies with a goal of creating dialogue rather than just data is a trend not to be ignored.  As people get more mobile and more "social", surveys will have to be more feedback-oriented.  And, designing surveys for follow-up action is a great way to collect feedback, increase customer dialogue and ultimately build better and more persistent customer relationships.

Wednesday, November 14, 2012

Text Analysis Makes Surveys Better

Text analytics really is becoming "Good enough" to change the way we collect feedback.

Until recently, Enterprise Feedback Management systems have been mainly web-survey-based technology without integrated text analysis capability.  Text analysis vendors emerged and claimed that their analytics could replace customer surveys.  I've argued that text analysis alone can't provide the depth of insight and ability to formulate actions that a well-designed survey provides.  Yet, I've also made the case for text analysis as a means to provide qualitative context to survey results, and that it is therefore a useful tool for helping to manage customer feedback.
 
Societal and technological changes, particularly the time pressures people face, the increased complexity of daily life and the ubiquity of mobile devices with internet access, are, I believe, forcing feedback professionals to consider alternatives to exhaustive (and lengthy) customer surveys.  Survey response rates have been in decline, and companies have reacted by moving to shorter surveys, making up for the lack of survey insights by doing more with analytics of all kinds.  Concurrently, text analysis technology has become more capable.  This has enabled the increased use of open-answer questions in surveys, typically replacing multiple attribute-oriented questions with a single open-answer question.
 
Trends and technology, I believe, now make text analysis a key component in any larger effort aimed at managing customer feedback.  That said, I still think text analysis is most effective when applied to survey-based verbatim text.  Readers of this blog know that I use Etuma360 (www.etuma.com).  And, having used it to evaluate survey verbatim text on several data sets, I'm now of the opinion that text analysis has the potential to substantially change how we do customer feedback.
 
For instance, in most customer surveys we try to carefully design question sets that help customers tell us about various attributes of our product or service offerings.  For firms with lots of products or services this often makes for cumbersome and complex customer survey projects, with lots of back-end analysis work needed to get to actionable results.  Needless to say, customers today don't want to spend the time required to answer all the questions we want answered.  And, just as importantly, executives today don't want to spend money on surveys that may take weeks or months to yield insights.
 
So, what to do?  Obviously, without customer feedback data there can't be any actionable customer insights.  We need an approach to customer feedback that is powerful yet concise.  Text analysis helps us get to that place.

Figure 1 - Example Etuma360 Topic with Sentiment list


The chart above comes from verbatim text responses in a survey (using the Net Promoter question) I did for a local soccer club.  The text analyzed came from a question at the end of the survey asking for "any additional feedback" the customer wanted to supply.  The survey asked specific questions regarding product/service attributes, in this case things like coaching, communication, venues, etc.  When I examine the topics identified via text analysis, they look a lot like the topics we identified as needing survey-question-based input.  If I had designed the survey to ask for open-answer feedback specific to customers' experiences, I think the text analysis results would have aligned even more closely with the product/service attributes we were interested in.

Text analysis tools also provide additional analysis capability (Etuma360 does, anyway).  For instance, Etuma has the ability to compare topic/sentiment for different groups within the survey, using background variables to filter open answers.  This generates data that closely aligns with loyalty drivers and provides a measure of each topic's relative value to selected customer subsets.  For companies with customer behavior data (revenue tier, tenure, etc.) embedded as background variables in their surveys, even more granular insights can be generated.  The example below shows a topic comparison of the data presented in Figure 1, filtered by "Promoters".  Getting this data was easy using Etuma360 in combination with QuestBack.  More importantly, these insights can be generated in real time, without weeks of analysis work, so executives can see and act on them quickly.

Figure 2 - Example Etuma360-based Topic / Sentiment Distribution filtered for "Promoters"
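
As a rough illustration of this kind of segment filtering (not Etuma360's actual API), here's a small sketch that tallies topic sentiment for Promoters only, assuming each response already carries an NPS score and a list of (topic, sentiment) tags produced by a text analysis step.  The sample rows are hypothetical.

```python
# Illustrative topic/sentiment tally filtered by NPS segment.
# Assumes upstream text analysis has already tagged each verbatim with
# (topic, sentiment) pairs; this is not Etuma360's actual API.

from collections import Counter, defaultdict

responses = [  # hypothetical rows: NPS score plus tagged verbatim
    {"nps": 10, "tags": [("coaching", "positive"), ("communication", "negative")]},
    {"nps": 9,  "tags": [("venues", "positive")]},
    {"nps": 6,  "tags": [("communication", "negative")]},
]

def topic_sentiment(rows, min_nps=9, max_nps=10):
    """Count sentiment per topic for responses in the given NPS range."""
    counts = defaultdict(Counter)
    for row in rows:
        if min_nps <= row["nps"] <= max_nps:        # Promoters score 9-10
            for topic, sentiment in row["tags"]:
                counts[topic][sentiment] += 1
    return counts

print(dict(topic_sentiment(responses)))   # topic -> {sentiment: count}
```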


Companies today have to deal with the reality that their customers want their time respected and their feedback heard.  Asking for 5 or 10 minutes to collect feedback a couple of times per year is about all they are willing or able to give.  Using EFM systems that employ web surveys and background data in conjunction with text analysis helps maximize action-taking and insight discovery while minimizing the time commitment required of the customer.

This seems like a combination that most companies want.  With QuestBack and Etuma, it seems to me that they can get it at a very reasonable cost.

If you are interested in trying Etuma360, they have a free trial program; just Click Here.  QuestBack does too.  E-mail me if you're interested.

Stewart Nash
stew.nash2010@gmail.com
www.linkedin.com/in/stewartnash

Friday, October 12, 2012

Etuma Text Analytics in Action

I've posted a couple of times over the last several months about Etuma360 and its unique combination of powerful analytics, ease of use and low cost.  But, like many new applications that deal with often-proprietary customer data, Etuma360 is hard to demonstrate, as customers are typically unwilling to let their data be seen by people outside the company.

Interestingly, Etuma has come up with an application that is entirely open to the world and which is evaluating text inputs on a real-time basis: the US presidential election.  Etuma has linked into the presidential campaigns' Facebook pages for both President Obama and challenger Mitt Romney, and has created a website that displays the results of the analysis.  It's called "The BaraMitter".  The website is: www.baramitter.com.

The website shows topic and sentiment analysis running for both campaigns' Facebook pages.  It's pretty interesting, and it's a good way to see an example of how Etuma360 can work using Facebook data feeds.  In addition, the website offers commentary about the analysis and advice on how to interpret the insights being displayed.  Fortunately, no political analysis is offered.

I encourage readers to take a look. 

If you are interested in trying Etuma360, they have a free trial program; just Click Here.

Stewart Nash
stew.nash2010@gmail.com
www.linkedin.com/in/stewartnash