Tuesday, January 15, 2013

Organizing on-line surveys for action taking

People love to give feedback. But they hate to take surveys.

I see lots of surveys that start out with "Please give us feedback" or "Your opinion is important to us," followed by "Click here to take our short survey." After clicking, I get something that indicates the survey will take anywhere from 10 to 25 minutes to complete. Like most people, I immediately quit the survey. In my mind, any time commitment beyond 5 minutes makes me a research subject, which I didn't sign up for when I clicked on the survey link.

Because most surveys are designed to collect data and rely on long questionnaires, a lot of customer feedback never gets collected by the companies that want and need it. Surprisingly to me, even many surveys asking the Net Promoter (likely to recommend) question aren't designed to generate immediate follow-up actions.

What companies should be doing is designing surveys that customers want to take and that have built-in triggering mechanisms for responding to feedback. In my experience, key constituencies want good relationships and will take time to give feedback when asked, provided the process respects their time and gives them something in return (a better relationship). Both become easily achievable with short, follow-up-supported surveys.

Feedback surveys should use short questionnaires: 10 or fewer questions presented to the respondent and a 2-5 minute maximum time commitment. They should ask only questions that are meaningful to the customer relationship (one reason I like Net Promoter). They should never ask for information the company already has in its databases. And they should always include a follow-up process for everyone who takes the survey, even if that follow-up comes later and is general in nature.
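To make those rules concrete, here is a minimal sketch in Python of what a short, follow-up-supported questionnaire definition might look like. The field names and structure are hypothetical, not any survey vendor's real schema; details that already live in the company's records are pre-filled rather than asked again.

# A minimal, hypothetical sketch of a short feedback survey definition.
# Field names and structure are illustrative, not any vendor's real schema.

MAX_QUESTIONS = 10  # keep the questionnaire short enough for a 2-5 minute commitment

# Information already held in the company's own records -- never ask for it again.
prefilled_from_crm = {"name": "Jane Doe", "email": "jane@example.com",
                      "product_owned": "Product XYZ"}

survey = {
    "title": "ACME Company Product XYZ feedback",
    "questions": [
        {
            "id": "nps",
            "text": "How likely are you to recommend ACME Company Product XYZ?",
            "scale": list(range(0, 11)),   # 0-10 Net Promoter scale
            "follow_up": True,             # answer can trigger a branch and an alert
        },
        {
            "id": "satisfaction",
            "text": "How satisfied are you with ACME Company Product XYZ?",
            "scale": ["Very dissatisfied", "Somewhat dissatisfied",
                      "Neither dissatisfied nor satisfied",
                      "Somewhat satisfied", "Very satisfied", "Not applicable"],
            "follow_up": True,
        },
    ],
}

# Enforce the "10 or fewer questions" rule before the survey goes out.
assert len(survey["questions"]) <= MAX_QUESTIONS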

How questions are asked should also be taken into consideration when designing feedback surveys. Here's an example of a typical product-oriented satisfaction question designed to collect data:

[Image: a standard satisfaction-scale question about ACME Company Product XYZ, with answer alternatives ranging from Very dissatisfied to Very satisfied, plus Not applicable.]
Here is the same question designed for feedback:

[Image: the same satisfaction question, configured so that the answer chosen triggers a follow-up question and an alert.]
On the surface, these two examples look very similar, except that in the feedback version the question is followed by a more specific question based upon the answer chosen. If the answer chosen is Very dissatisfied, Somewhat dissatisfied, or Neither dissatisfied nor satisfied, the customer gets:

You indicated you are less than satisfied with ACME Company Product XYZ. Please tell us why. We will contact you shortly by e-mail to follow up.


If the customer indicates Somewhat satisfied or Very satisfied, they get:

Please tell us what you like best about ACME Company Product XYZ.   We will contact you shortly by e-mail to follow up.

In the above example, in addition to triggering a question branch, each set of answer alternatives triggers an alert or notification to someone to take follow-up action. In QuestBack we generate an e-mail to a designated person; in other feedback management systems (and also in QuestBack if needed), triggering is done through a CRM system. In either case, the survey is optimized for feedback: no branch or alert is triggered if Not Applicable is selected, and specific questions and triggers are tied to specific answer alternatives.
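For readers who want to see the mechanics spelled out, here is a rough sketch in Python of this kind of answer-driven branching and alerting. It is not QuestBack's actual implementation; the answer labels come from the example above, while the function names and e-mail addresses are purely illustrative assumptions.

# A rough, hypothetical sketch of the branching and alert logic described above.
# This is not QuestBack's implementation; send_alert_email stands in for whatever
# e-mail notification or CRM integration a feedback tool actually provides.

NEGATIVE_ANSWERS = {"Very dissatisfied", "Somewhat dissatisfied",
                    "Neither dissatisfied nor satisfied"}
POSITIVE_ANSWERS = {"Somewhat satisfied", "Very satisfied"}


def send_alert_email(recipient, message):
    """Stand-in for the alert step (e-mail to a designated person, CRM task, etc.)."""
    print("ALERT to %s: %s" % (recipient, message))


def handle_satisfaction_answer(answer, customer_email):
    """Return the branch question for this answer, or None if nothing is triggered."""
    if answer == "Not applicable":
        return None  # no branch question, no alert

    if answer in NEGATIVE_ANSWERS:
        send_alert_email("account.manager@acme.example",
                         "Dissatisfied response from %s: %s" % (customer_email, answer))
        return ("You indicated you are less than satisfied with ACME Company "
                "Product XYZ. Please tell us why. We will contact you shortly "
                "by e-mail to follow up.")

    if answer in POSITIVE_ANSWERS:
        send_alert_email("customer.success@acme.example",
                         "Satisfied response from %s: %s" % (customer_email, answer))
        return ("Please tell us what you like best about ACME Company Product XYZ. "
                "We will contact you shortly by e-mail to follow up.")

    return None  # unrecognized answer alternative


# Example: a dissatisfied answer triggers both the branch question and an alert.
print(handle_satisfaction_answer("Somewhat dissatisfied", "jane@example.com"))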

Good feedback-oriented survey processes should have at least a couple of trigger questions: one for loyalty or advocacy, one for general satisfaction or experience, and possibly one or more based on product or service attributes that can be boiled down to an actionable response.

Surveying key constituencies with the goal of creating dialogue rather than just data is a trend not to be ignored. As people get more mobile and more "social," surveys will have to become more feedback-oriented. Designing surveys for follow-up action is a great way to collect feedback, increase customer dialogue, and ultimately build better and more persistent customer relationships.