Monday, February 28, 2011

Feedback Enhanced Selling

Improving Sales Processes using Pre-Lead, Lead and Prospect feedback

As a career business development guy, I've always been intrigued by the potential of feedback-enhanced selling, particularly since I began representing QuestBack, a feedback management tool.  Yet even to me, until recently, it seemed that feedback from leads and prospects was just as easily collected via direct e-mails or phone calls.  After all, what else are sales people supposed to do except communicate directly with leads, prospects and customers?

When I look at sales jobs today, the expectation is often for sales people to contact 100 people per day by phone.  Now, no one expects anyone to have conversations with 100 people daily.  But these firms want their sales folks to dial 100 times a day in order to have maybe five meaningful conversations.  Every time I see this it bothers me: companies are willing to pay people to engage in behavior with essentially a 95% failure rate.  And, with buyers not answering their phones at all, will the number of calls be 200 a day next year, or 500 five years from now?  I don't know.  But ten years ago the expectation was for sales people to contact 50 leads per day.  So the number is rising, and unless a new process is found, it's likely to continue to do so.

So the question is:  Is there a way to get from 5 conversations in a day to 6, 7 or 8?  I think possibly there is.  Web- or e-mail-based surveys may offer a solution.  Some reasons leads don't engage with sales people: the timing is wrong (no plans to buy for a while), the solution "fit" is bad, or preferable alternatives are already "on the table".  Hence, no desire to talk to your sales people, and no contact is made.  Done well, lead feedback will help sales reps time their calls to contacts, understand the contacts' needs and avoid contacting leads where no fit exists.  The result: less-stressed sales reps, more targeted calling, more prospects in the funnel, and happier, better-qualified leads and prospects.  All good stuff.

Some thoughts on when to gather feedback from leads or prospects:

  • During website visits.  Surveys should be offered to visitors about the information being presented.  The idea is to identify whether the information is sufficient and helpful, or whether the visitor wants more, or even a contact from you.
  • After a lead enters the sales funnel.  A survey should be done to ascertain sales-cycle stage, needs and perceived timeline.  The idea is to gauge interest level, understand budgets, determine competition and understand the perception of your company's offering.
  • After a sales person has made direct contact with a lead.  The idea is to determine whether the lead learned what they needed from you to move forward, and whether their perception of you improved or deteriorated.
  • After an inside sales representative designates a lead as a prospect.  The idea is to validate that action and learn what the new prospect perceives about your offering that makes him interested in additional dialogue.
  • After a "key account manager" has been in contact with a prospect.  The idea is to validate that the prospect sees the product fit and sales cycle the same way your KAM has reported them.  Collecting feedback 30, 60 or 90 days after a relationship has started should help with understanding the possibilities for a sale occurring.
Technology already exists to automate a great deal of this kind of effort. QuestBack has even built a series of templates to use for automating feedback collection from leads and prospects during the sales cycle.  An example of QuestBack's sales process feedback approach can be found here: 

http://www.questback.com/areas-of-use/private-sector/lead-generation/

Tuesday, February 22, 2011

A Repost of a Good Net Promoter discussion with pros and cons...

I've written in the past about the Net Promoter methodology for measuring customer satisfaction and loyalty.  If you've read any of my blog posts, you'll realize that I'm a big fan of Net Promoter, and I frequently recommend its use to my customers.

Earlier today I was browsing a LinkedIn group where I'm a member when I came across this post (itself a repost) of an article originally published in Marketing Week magazine.

I thought the article gave a pretty good account of the pros and cons associated with Net Promoter and its use.  So I thought I'd share it here.  Enjoy.  The URL is:

http://www.linkedin.com/news?viewArticle=&articleID=374834583&gid=1772348&type=member&item=44620162&articleURL=http%3A%2F%2Fwww%2Emarketingweek%2Eco%2Euk%2Fanalysis%2Ffeatures%2Fhow-to-get-more-from-your-score%2F3023551%2Earticle&urlhash=iHgy&goback=%2Egmp_1772348%2Egde_1772348_member_44620162

Thursday, February 3, 2011

Customer Surveys - Better Results by Linking Responses with Business Data

In my last blog post I talked about some things not to do in B2B customer surveys.  A big one for me is asking questions your company already knows the answer to.  Business data is a huge area of customer questioning that fits the "Don't Ask" profile.  Some examples of survey questions I've seen where business data is asked for:

- "How long have you been a customer?"
- "What region are you located in?"
- "Please select the [Company] products you have purchased or use?"
- "When did you make your last purchase from us?"

When asking this kind of question, the surveyor is taking a "shortcut" by having the customer fill in or validate data that already exists in the company's databases.

The solution is to create linkage between customer surveys and business data.  By doing so, you can avoid asking customers for business data and focus on the information you actually want.  Fortunately, it's usually fairly easy to get business data for the customers you plan to survey.  And, by organizing your survey process to take advantage of business data, your surveys can be shorter yet still effective from an insight perspective, while being considerate of customers' time.

Four techniques for incorporating business data into customer surveys

Needless to say, linking survey responses to business data is not a new challenge, and companies often go to great lengths to do it.  Larger firms often integrate customer surveys with other systems, so that the information derived from surveys becomes part of the customer record and can then be extracted, aggregated and analyzed along with other customer data via the company's normal reporting mechanisms.  But even in this scenario, surveys that are not integrated with other systems (because IT has to do the integration on a survey-by-survey basis) still suffer from the linkage challenge.  So, companies have evolved four techniques to link business data with customer survey responses.

1. Brute force. In this popular approach (often deployed when using low-end survey tools), a customer list is developed in a spreadsheet, with business data included for each individual to be surveyed.  An identifier is "coded" to each individual customer in the spreadsheet, and each e-mail to be sent is coded with the same identifier so it (hopefully) matches the data in the spreadsheet.  Most survey tools allow this kind of coding and will kick out the resulting survey response data in a spreadsheet file.  If coded correctly, matching up survey responses with business data later is pretty easy (a minimal sketch of this matching step appears at the end of this post), and reporting can then be done via spreadsheets or other data manipulation tools.  The downside is that this is a time-consuming and error-prone process: data has to be assembled, coding assigned and used properly, responses re-matched after the survey, then handed off to a spreadsheet guru to generate the analyses and reports.  The time investment often outweighs the cost savings of using a low-end survey tool.

2. The multiple mailing method. This is a similar but slightly more sophisticated approach. Instead of coding respondents and matching responses later, the spreadsheet is filtered in advance by the needed business data, and each filtered subset is then sent the survey.  When responses come in, you already know that batch #1 is from "Region A large customers", batch #2 is from "government accounts in California", etc.  Survey responses organized and sent this way are easy to interpret and report on, but hard to do subsequent analyses on.  Needless to say, this method is also somewhat cumbersome in that several, possibly many, e-mailings must be set up and scheduled.  And care must be taken to ensure that no individual appears in multiple subsets (or they'll receive multiple survey invitations).

3. The customer panel.  In this method business data is stored within the survey tool in a "panel" (a separate database).  Surveys are sent to panel members, and responses are automatically tagged with the information stored about them in the panel.  This is generally a good approach.  One flaw is that the panel needs to be refreshed or updated periodically so that its business data stays relevant.  A second is that survey tools with built-in panel support are often at the high end of the market, or panel support is an extra-cost feature.

4. Pre-load business data into the survey. In this method business data and customer names are loaded into the survey tool, and the survey is designed and e-mailed out.  When responses are received they are already tagged with business data, and responses are filterable on the business criteria loaded into the survey.  Slicing the data becomes fast, easy and fairly painless.  If the survey tool has good analytics and reporting, this approach can save lots of time and provide immediately actionable data for follow-up.  It also doesn't require panels, integration with CRM or other systems, or spreadsheet gurus for data analysis, and it's not subject to post-survey data matching errors.

I view method #4 as having the best combination of affordability, flexibility, analytics and reporting power, and time conservation for both customers and the surveying company.  I am not aware of many survey tools that support method #4.  QuestBack is a feedback management system that does.
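
To make the trade-off concrete, here is a minimal sketch of the manual re-matching step that method #1 requires and method #4 avoids.  It assumes the survey tool exports responses as a CSV carrying the same respondent identifier used in the customer spreadsheet; the file and column names are hypothetical, not from any particular tool:

```python
# Hypothetical sketch of the "brute force" re-matching step (method #1).
# Assumes two CSV exports: business data keyed by a respondent identifier,
# and survey responses coded with the same identifier.
import csv

# Load business data: respondent_id -> {"region": ..., "segment": ..., ...}
with open("business_data.csv", newline="") as f:
    business = {row["respondent_id"]: row for row in csv.DictReader(f)}

# Merge each survey response with its business data; collect coding errors.
merged, unmatched = [], []
with open("survey_responses.csv", newline="") as f:
    for response in csv.DictReader(f):
        record = business.get(response["respondent_id"])
        if record is None:
            unmatched.append(response)  # identifier mis-coded or missing
        else:
            merged.append({**record, **response})

print(f"Matched {len(merged)} responses; {len(unmatched)} unmatched.")
```

Every unmatched row here represents exactly the kind of coding error that makes method #1 time-consuming in practice.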

Wednesday, January 26, 2011

My 7 "Rules" For B2B Customer Surveys


Being in the business of selling, supporting and using web survey tools, mostly for customer surveys, I’ve seen hundreds of them. I’ve found that customer surveys, and particularly B2B surveys, need to be clear, concise and sensitive to the customer’s time. So, I’ve developed a set of “rules” that I try to follow when helping my customers with their customer surveys.

Rule #1 – Clearly define the objective for the survey

Many customer surveys I've run across have multiple (and sometimes conflicting) objectives. This happens a lot to businesses that survey their customers just once a year. In this situation, the customer survey ends up being everybody's vehicle for capturing "something". In the end it often results in too many questions, low response and completion rates, unclear follow-up actions and ultimately, annoyed customers who don't want to take your surveys.

So, I always try to define the “one critical data point” the survey must develop, and only ask questions that qualify that one data point. If that task can be achieved with fewer than 10 questions, secondary – but related – topics can be introduced.

A survey methodology that does this well is the Net Promoter method developed by Fred Reichheld. What I like about Net Promoter is that it's based on one question: "How likely is it that you would recommend [company] to a friend or colleague?" The other questions in a Net Promoter survey are designed to qualify responses to the "recommend" question. Done well, Net Promoter surveys are short and take less than a couple of minutes, while also providing insights, clearly interpretable data and clear follow-up actions that have a purpose (enhancing relationship quality).
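
Part of why the data is so interpretable is that the scoring arithmetic is simple. The sketch below uses the standard Net Promoter buckets (promoters score 9–10, detractors 0–6); the sample scores are made up for illustration:

```python
# A minimal sketch of standard Net Promoter scoring: NPS is the percentage
# of promoters (scores 9-10) minus the percentage of detractors (scores 0-6).
def net_promoter_score(scores: list[int]) -> float:
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 -> NPS of +30.
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 3, 6]))  # 30.0
```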

Rule #2: Survey customers regularly.

Surveying regularly may seem counterintuitive, given how often people complain about getting too many surveys. Yet I think people mainly complain about "bad" surveys (i.e. too long, irrelevant questions, etc.). If you get a reputation as a "bad" surveyor, your response rates will be low and "opt-outs" will be high. But if your customer surveys are well done (short, easy to complete and always followed up), your customers will interact with you in relatively high numbers, even with quarterly surveys.

Rule #3: Always keep surveys short.

Personally, I don’t like to ask more than 10–15 questions in a customer survey, and I try to avoid long matrix-type questions. If you are using a regular survey process, after asking your 10 core questions you can ask some secondary, related questions.  In a quarterly feedback process, different sets of secondary questions can be rotated into the questionnaire.

Rule #4: Only send surveys to those people who can give you the data you need.

Blasting a survey to people who can’t give you the information you need just guarantees that you’ll annoy most of them.

Rule #5: Always act on each customer's survey response at an individual level

Acting on survey responses is critical. Customers take their valuable time to give you feedback. You have an obligation to tell them what, if anything, you are doing in response to it. Not following up tells customers there’s no point to giving you future feedback.

Rule #6: Avoid irrelevant questions at all costs.

A common mistake: asking a purchasing contact to evaluate a product's technical capabilities. It happens all the time, but shouldn’t, and it makes you look bad when it does. After all, a respondent almost always provides some documentation about who he is, where he’s located, what he does for the company, etc. If multiple roles are being surveyed, use question branching, data piping or question routing to present only the relevant questions to each person. Most good feedback management systems support branching, routing and piping, so there’s no reason not to use those capabilities.
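
To make the branching idea concrete, here is a hypothetical sketch of role-based question routing; the roles and questions are illustrative only, not from any specific feedback management system:

```python
# Hypothetical sketch of question branching: the respondent's stated role
# determines which question set is presented, so a purchasing contact is
# never asked to evaluate technical capabilities.
COMMON_QUESTIONS = [
    "How likely is it that you would recommend us to a friend or colleague?",
]
QUESTIONS_BY_ROLE = {
    "purchasing": ["How easy is our ordering and invoicing process?"],
    "technical": ["How well does the product meet your technical requirements?"],
}

def questions_for(role: str) -> list[str]:
    # Everyone gets the core questions; role-specific ones are added
    # only when they are relevant to the respondent.
    return COMMON_QUESTIONS + QUESTIONS_BY_ROLE.get(role, [])

print(questions_for("purchasing"))
```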

Rule #7: Never ask a customer a question you already have the answer to.

It amazes me how often I’m asked in surveys for data that I’ve supplied to the company many times in the past, and I hate it every time. It tells me that whoever built the survey was too lazy to look up relevant information about me and, worse, was willing to waste my time gathering it again. So, when a survey crosses my desk asking “dumb” questions (what’s my name, what products do I have, how long have I been a customer, and the like), I simply terminate the survey.

More importantly, it's not considerate of a customer's time to ask questions you should have the answers to. Think about this. You are asking your customer to use his time to give you information that you could have used your time to develop.

I help businesses with customer surveys all the time.  If you'd like help with a survey project you're planning, please feel free to ask me for advice.  

My e-mail is: stew.nash2010@gmail.com
I use QuestBack for my survey projects. 



Tuesday, January 11, 2011

Action Management & Tracking Survey Results Available

I've been running a non-scientific survey (see sidebar) targeted at people who use customer feedback processes and CRM systems. Additionally, I've been collecting information via LinkedIn posts and conversations with serious practitioners of Voice of the Customer (VOC) processes.  To get the survey results, just click here: http://www.nash-efm-consult.com/Action_Tracking_and_Management_Survey_Results.pdf

I've found that most companies that employ feedback management systems and a case management approach have two challenges. The first is verifying that follow-up occurred and was in some way effective (i.e. that it contributed to boosting NPS, CSAT or another metric). The second is evaluating, at a higher level, the types of issues being responded to by front-line people and, of course, creating strategy or higher-level actions (and then communicating them) that form the "real" organizational response.

I found that a large minority of organizations (48%) don't get much value from their Alert processes. This finding bothered me, because 52% report receiving significant value and 75% report that Action Management & Tracking are very important to the success of their customer feedback initiatives.  So, I think this number points to one of a variety of possible issues, including (but not limited to):
  • Taking follow-up actions is too resource-intensive to justify
  • Follow-up actions aren't perceived as effective (possibly because people aren't empowered to fix issues), so they aren't being pursued
  • The Action Management process isn't well supported by tools (CFM tools not integrated with CRM?)
  • Others?
And another issue that I think should be looked into, but rarely is: staff perceptions of their follow-up tasking. That is, do Account Managers and Support people:

- provide mostly lip service when doing survey follow-up
- solve real issues that benefit the business
- feel that they are given the tools and resources to "add value" for customers

I think companies need a metric that reflects employees' perception of the enterprise's effectiveness at dealing with and acting on customer feedback. It could be quite useful to gauge that "response effectiveness" sentiment alongside CSAT or NPS.

Some of the issues with Action Management and Tracking can be dealt with by using surveys to revisit feedback customers provided 30 or 60 days earlier, and by surveying employees periodically (quarterly?) on the "response effectiveness" of their actions.