Friday, November 14, 2014

What's the next consolidation in EFM?

There's been a large recent influx of new capital into EFM vendors, whether through private equity, through merger, or sometimes through both merger and additional equity. So, I thought I'd revisit my thinking on the consolidation occurring in EFM.

All technology markets undergo consolidation over time. Normally, it occurs as the technology itself (whatever it is) grows and vendors adapt to that growth. This has been steadily happening in the EFM space.

A few months ago I wrote about this in a post titled "More Consolidation in the EFM space." It outlined the merger and acquisition activity going on in EFM. The discussion was about the business consolidations by Verint (Vovici), Confirmit (CustomerSat), SurveyMonkey (CustomerSat and ClickTools) and others, including QuestBack. The post is available here:

At the time, I had three thoughts about the merger activity.....

"Thought #1.  The EFM space is getting crowded at the top end of the market.  Gartner, Forrester, Aberdeen and others have been beating the drum for Enterprise Feedback Management for a long time, at least five years by my count.  Since most of the EFM vendors focus their efforts on selling to "Enterprise" class customers, as time has passed more of them have adopted solutions, leaving less growth available to the rest of the players.  Hence, acquisitions of high-end products."

Clearly, the EFM space remains quite competitive at the top end of the market. So much so, in fact, that EFM vendors are competing with market research consultants, who sit at the top echelon of the market and act as strategic vendors to large businesses. Confirmit / CustomerSat and even QuestBack / Globalpark seem to be pushing into the project space of traditional market research consultants. Market research consulting firms are reacting by acquiring EFM technology vendors. The Empathica / Mindshare and Maritz / Allegiance mergers look like this kind of combination to me.

I'm not sure how this will play out. Consulting firms typically purport to be "tool agnostic." It may be that these particular companies have vested interests in the survival of Mindshare and Allegiance, having brought those products to customers as part of solutions. If that's the case, it argues against the standalone market effectiveness of those products going forward. Alternatively, it could be that the consultants see a strategic advantage in owning their own EFM products. Potentially they've even acquired significant projects or new customers as a result. From the outside there's really no way to tell.

To me, though, when I see technology solutions that require large amounts of consulting support to enable success, the benefits of the solution need to be correspondingly large; usually that means enterprise scale. Ultimately, I think that means the enterprise software guys will be sniffing around EFM companies too.

"Thought #2.  Serving the small and mid-tier customer segments normally requires that companies have either scale or distribution in order to effectively market.  SurveyMonkey has scale, QuestBack has distribution (I'm a QB reseller), Vovici has partnerships (Oracle in particular) and ConfirmIt now has scale at the high end of the market."  

Since publishing my earlier comments there've been changes at these companies. SurveyMonkey, QuestBack and Confirmit have all become more potent competitors via organic growth and acquisitions. What's really new is that Qualtrics and Satmetrix have joined the ranks of larger players in the market. The common theme among all these firms is that they now have scale, breadth of market access and technology competency. It's a good bet that all of them will continue to grow. Some may become targets themselves for the software giants out there looking to acquire SaaS businesses in the EFM / CXM space.

"Thought #3.  Many of the smaller players in the EFM market have neither scale nor distribution. They have to rely on organic growth while competing with larger, better funded and more effectively structured companies.  It is among this group of companies that I expect to see further consolidations.  Some companies on the list:  Qualtrics, Medallia, Satmetrix, Allegiance, SNAP surveys and KeySurveys.  These firms serve Enterprise customers or mid-tier customers using a direct sales or assisted direct sales model. They could all use more scale, more distribution or both."

Not much new here except that the list is smaller. Some companies have moved up and others have been acquired. The rest, in my opinion, will be bought out or will just continue muddling along.

So what's next for EFM?

I see EFM being subsumed into the Customer Experience Management (CXM) space. Businesses today realize that feedback comes to them via multiple channels and in different forms. CXM has been about finding ways to collect that largely unstructured feedback and integrate it with standard, more structured feedback processes. The goal is to produce a coherent view of customer perceptions, issues, advocacy and angst as they relate to products and processes. The point is that EFM / CXM's value proposition has moved largely to back-end analyses and visualizations that document insights and facilitate actions that improve customer experience.






Friday, August 1, 2014

Text Analysis with almost No Manual Effort

Last week I posted what was, essentially, a critique of manual, word cloud based topic mapping processes. These types of manual processes are very common among survey tool vendors trying to incorporate text analysis into their solutions. The critique I made was based on the notion that doing lots of manual work on word clouds and topic maps is unnecessary. In this week's post, I'm going to try to show a better way, using Etuma360.

All text analysis has to start with a stream of text. The text can come from web forms, chat logs, forums or surveys. From that text, Etuma360 builds a word cloud that represents how words are used across the text stream. Rather than manually mapping words to topics, what if the text analysis system already possessed a topic database that the words in the word cloud were mapped to? This would eliminate almost all the manual effort involved in mapping words to topics. That's exactly how the Etuma360 product works.
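
To make the idea concrete, here's a minimal sketch of that kind of lookup in Python. The topic database entries below are hypothetical examples of my own; Etuma360's real database and matching logic are far more sophisticated.

    # A minimal sketch of mapping word cloud words to topics via a pre-built
    # topic database. The entries below are hypothetical, not Etuma360's.
    TOPIC_DATABASE = {
        "shipping": "Delivery", "delivery": "Delivery", "late": "Delivery",
        "price": "Cost", "expensive": "Cost",
        "broken": "Damage", "damaged": "Damage", "cracked": "Damage",
    }

    def map_words_to_topics(words):
        """Return {topic: [words...]} for every word found in the database."""
        topics = {}
        for word in words:
            topic = TOPIC_DATABASE.get(word.lower())
            if topic:
                topics.setdefault(topic, []).append(word)
        return topics

    print(map_words_to_topics(["Shipping", "was", "late", "and", "box", "damaged"]))
    # {'Delivery': ['Shipping', 'late'], 'Damage': ['damaged']}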

A text stream is fed into Etuma360 using either a file upload or an API process. In this case, it comes from a customer survey uploaded to Etuma360. A word cloud is automatically generated.....
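
For illustration only, an API feed might look something like the sketch below. The endpoint URL and payload shape are hypothetical placeholders, not Etuma360's actual API (check Etuma's documentation for that).

    import requests

    # Hypothetical endpoint and payload shape -- not Etuma360's real API.
    API_URL = "https://api.example.com/feeds/upload"

    comments = [
        {"id": 1, "text": "Shipping was late and the box arrived damaged."},
        {"id": 2, "text": "Great price, fast checkout."},
    ]

    resp = requests.post(API_URL, json={"source": "customer_survey", "items": comments})
    resp.raise_for_status()  # fail loudly if the upload didn't succeed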


And a topic / sentiment list is also automatically generated. No manual effort required.....


Because Etuma360 does the topic mapping automatically, the analyst can focus on the visualizations their end users need......

As I said: easy, fast and effective. Consider how much money a survey analyst makes and how much time they have to spend mapping words to topics. If it's 100 hours in a year at $100 / hour, the cost is $10K. And that's before buying software or generating any useful analyses. Etuma360 saves you all that labor cost and more.

To take a free trial of Etuma360 Click the link below:

 https://response.questback.com/demo_nash_stewart_qbbostonusa/etuma90daytrial/





Saturday, July 26, 2014

Analyzing Survey Open Ended Questions

I've been on the distribution list for Survey Magazine (electronic version) for several years now. Recently, they published an article (I think sponsored by Cvent) about using text analysis on survey open-ended questions. Though the article offers some useful advice, it seems to me that there are better ways to do text analysis on survey open-ended questions than the methodology it proposes. So, I thought I'd discuss some of the main points made and offer some thoughts. For those interested, the URL below will bring you to the entire article. For clarity, I've quoted the article's numbered points verbatim and follow each with my own remarks.

http://viewer.epaperflip.com/Viewer.aspx?docid=b3b90dd4-b2bd-4945-8af1-a36100b7c43e#?page=36


The Survey Magazine article begins with the basics about why it's important to do text analysis. It goes on to make two points: First, that analyzing text is difficult, and why ("The hardest part about including open-ended questions in your survey is analyzing responses. Unlike close-ended questions, it’s doubtful that any two open-ended responses will be exactly the same."); Second, that a text analysis plan is necessary. The article goes on to share how to create such a plan.

I would agree that analyzing text is difficult, especially using a manual inspection method coupled with word or phrase searches. I also agree that doing text analysis offers lots of value and insight in the right scenarios. But, in my opinion, the methodology proposed in the article is "the hard way" to do text analysis, and even then it will really only be useful on shorter ad-hoc surveys.

The article outlines five steps in a proposed text analysis process.  

1. Use word cloud technology. The best way to begin your text analysis is by using word cloud technology. The technology sifts through responses and creates a visual representation of the most frequently used words or phrases. The larger the font of the word in your cloud, the more relevant it is to your data. Once you've seen which words pop up the most, you can start to make categories to group responses and analyze trends.

Almost all verbatim analysis technology uses word clouds in one way or another. More sophisticated products combine words or phrases with usage context. For instance, in retail e-commerce situations, words like "web site" and "click" show up all the time, but analyzing them is valueless without more context. Manually ascribing that context is a major endeavor if any real response volume is involved. So, the process outlined is very manual and, ultimately, subjective to the analyst mapping the word or phrase to a category. Another analyst at another time might map the same data to a different category, based on their own interpretation at the time (so there are multiple dimensions of subjectivity). Word clouds are useful tools, but they are not the "end all" of text analysis.
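
For reference, the core of "word cloud technology" is just word frequency counting. A minimal sketch of my own (not any particular vendor's implementation):

    from collections import Counter
    import re

    STOPWORDS = {"the", "a", "an", "is", "it", "to", "and", "was", "were", "too"}

    def word_frequencies(responses):
        """Count word usage across responses; cloud font size scales with count."""
        words = []
        for text in responses:
            words += [w for w in re.findall(r"[a-z']+", text.lower())
                      if w not in STOPWORDS]
        return Counter(words)

    responses = ["The size was wrong", "Size and color were great", "Cost is too high"]
    print(word_frequencies(responses).most_common(3))
    # [('size', 2), ('wrong', 1), ('color', 1)]

The hard part isn't the counting; it's the context and category mapping that comes afterward.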

2. Establish categories. The next step in analyzing your open-ended responses is creating categories. Use your word cloud for insights into the range of thoughts and feelings articulated by your respondents. For example, if you asked customers how they think your organization can improve its product, and the words “cost”, “size”, and “color” loom the largest, create categories for those words. Once you begin to read your responses file them under the appropriate categories. If any of your responses fit more than one category, put them in both.

This is largely good advice. But, again, it's a lot of manual work that would have to be repeated on a survey-by-survey basis. Several commercial text analysis systems I am aware of (including Etuma360) will do this kind of work automatically and then let you tweak the topic analysis produced, saving boatloads of time. And again, building subjective categories can be potentially problematic, for the reasons discussed previously.
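
If you did want to script the article's manual step, it amounts to assigning each response to every category whose cue words it mentions. A sketch, with hypothetical categories of my own:

    # Hypothetical categories built from word cloud inspection.
    CATEGORIES = {
        "Cost": {"cost", "price", "expensive"},
        "Size": {"size", "fit", "large", "small"},
        "Color": {"color", "shade"},
    }

    def categorize(response):
        """Return every category whose cue words appear in the response."""
        words = set(response.lower().split())
        return [cat for cat, cues in CATEGORIES.items() if words & cues] or ["Uncategorized"]

    print(categorize("The price was fine but the size ran small"))
    # ['Cost', 'Size']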

3. Review and refine. As you begin to inspect responses more closely, you will probably find that you have to make adjustments to your categories. If responses used similar words to describe conflicting sentiments, you’ll have to create new categories; if the reverse is true, you can combine categories.

If you've implemented a manually constructed, word cloud based approach, this is good advice. Language is a living, dynamic construct, and interpreting it is always a "tweaking" process. But there's a better way to do it than the process proposed. At Etuma, we use a set of layered ontologies to map language meanings to our topic database. Effectively, this lets us use input from hundreds of our users to improve everyone's language interpretation, largely eliminating the need for each customer to manage that process themselves. Other text analysis tool vendors employ statistical mapping models that they tweak for individual scenarios and customers. The point is, "topics" found in text streams should be largely auto-identifiable, especially in known contexts like customer service or e-commerce.
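
As a rough illustration of the layering idea (my own simplification, not Etuma's actual ontology structure), a domain-specific layer can override a generic language layer:

    from collections import ChainMap

    # Hypothetical example entries. The domain layer is consulted first.
    generic_layer = {"slow": "Speed", "broken": "Product quality"}
    ecommerce_layer = {"slow": "Site performance", "checkout": "Purchase process"}

    ontology = ChainMap(ecommerce_layer, generic_layer)

    print(ontology["slow"])    # Site performance (the e-commerce layer wins)
    print(ontology["broken"])  # Product quality (falls back to the generic layer)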

4. Make correlations. Now it’s time to examine the text within the framework of your overall survey. Start to couple open-ended responses with corresponding close-ended responses to draw conclusions about why respondents gave the answers that they did. If you used an open-ended question as an avenue for respondents to give an “other” answer to a multiple choice question, try and determine if there is a clear winner. You should also cross tabulate your data by demographic to see if any patterns emerge. Find out if certain groups within your sample tended to answer open-ended questions in the same way.

Clearly, this is something that should be done. And again, in my opinion, there are better ways to do it than the one proposed. At Etuma, we simply connect the entire survey (and background data set) by API or upload process, and topics are automatically connected with filtered data subsets based on survey response categories. It's a lot less work, and the analyses produced simply auto-update over time.
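
For anyone curious what that correlation step looks like in code, here's a sketch in pandas with made-up data: each verbatim has already been tagged with a topic and sits alongside the closed-ended NPS group from the same survey row.

    import pandas as pd

    df = pd.DataFrame({
        "nps_group": ["Detractor", "Detractor", "Passive", "Promoter", "Promoter"],
        "topic":     ["Delivery",  "Cost",      "Cost",    "Service",  "Delivery"],
    })

    # Which topics dominate within each response group?
    print(pd.crosstab(df["topic"], df["nps_group"]))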

5. Summarize your results. After you analyze your results, summarize your findings and include any quotes from the text that were especially illustrative of your conclusions.

Of course, results should be summarized. But for on-going surveys it has to be done regularly, since the topics people talk about change over time.

--------------------------------------------------------------------------------------------
Lots of web survey platforms are implementing word cloud based text analysis. As someone who's used text analysis tools (Etuma360 primarily) for a couple of years now, I find that it is of limited value unless certain criteria are met. Some of those are:
  • Larger surveys with lots of open ended responses  
  • Permanence. Surveys that run over long periods of time are better suited for coupled text analysis than surveys that are ad-hoc
  • Lots of background data about survey respondents
The article in Survey Magazine outlines some useful advice. In my opinion it is more useful for smaller, research-oriented types of surveys (ad-hoc with a few hundred responses). For larger, long-running operational surveys, the text analysis approach outlined in Survey Magazine will be a lot of work. Using something like Etuma360 (www.etuma.com) for text analysis on larger and on-going surveys makes a lot more sense.

To try Etuma360 click here:


Wednesday, July 16, 2014

A cool new QuestBack Video - check it out.


If you follow this blog at all, you know that I represent QuestBack AS. QuestBack has some really good products but is far from a household name here in the U.S. And, every now and again they do something really "cool" from a marketing perspective too. The video below is one such example. Click here and check it out.

Stewart Nash
LinkedIn: www.linkedin.com/in/stewartnash

Saturday, June 21, 2014

Etuma "Dashboards" Make Verbatims Actionable



Easy to build and easy to share topic based reporting.

I work with Etuma Ltd. and their fine verbatim analysis solution - Etuma360. Over the last year or so Etuma has been building out its "Dashboard" tool.  Each enhancement to the dashboard tool has improved capabilities, usefulness and value-added.   
With the Etuma360 dashboard I can now look at a broad topic - let's say Detractor negative comments. Inside that data set there will be additional topics (maybe shipping damage, late arrival of goods, poor service quality, or any number of other potential issues). At a high level this is great information, but in order for someone to start acting on it, it needs further analysis. Etuma's dashboard tool makes this easy.

For example: Etuma works with several retailers, both of the "brick and mortar" and internet varieties. In retail, damaged goods tend to be something that affects customer perceptions as well as margins and profits. So retailers want to know about, and pay attention to, feedback regarding damaged goods their customers receive. Customers often provide this kind of feedback in surveys or in contact forms on customer service websites. Using Etuma360, a business can automatically monitor the feedback coming through these channels. They can then easily create a dashboard that puts all comments referencing topics related to "damage" into one report. That report can then easily be shared with the relevant managers so they can see specific feedback (including things like order number or customer number) that would allow them to assess shippers, packaging or other aspects of the logistics chain that impact damage to shipped goods.
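
A rough sketch of that kind of topic aggregation, using hypothetical tagged feedback records rather than Etuma360's actual report builder:

    # Feedback records already tagged with topics by the text analysis step.
    feedback = [
        {"order": "A-1001", "topics": ["Shipping damage"], "text": "Frame arrived cracked."},
        {"order": "A-1002", "topics": ["Late delivery"], "text": "Order was two weeks late."},
        {"order": "A-1003", "topics": ["Packaging", "Shipping damage"], "text": "Box crushed, print bent."},
    ]

    DAMAGE_TOPICS = {"Shipping damage", "Packaging"}

    # Pull every comment referencing a damage-related topic into one report.
    damage_report = [f for f in feedback if DAMAGE_TOPICS & set(f["topics"])]
    for item in damage_report:
        print(item["order"], "-", item["text"])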

This kind of topic aggregation is very useful for creating actionable data from big volumes of feedback.  With Etuma's new dashboard report sharing capability, getting topic dashboards built and distributed to operational managers is both simple and easy. And, because the topic dashboards tend to be very specific, even large amounts of verbatim feedback tend to filter down into manageable "bites" of topic based feedback, making "actioning" that feedback much, much easier.  

Etuma360's built-in topic database allows even the newest customer to quickly start filtering and aggregating relevant feedback. Etuma's price model lets even smaller businesses get started.

Stewart Nash - stewart@etuma.com
LinkedIn: www.linkedin.com/in/stewartnash



Monday, June 16, 2014

Starting an NPS program from scratch - Some Thoughts


I've been working with businesses and non-profits for a number of years now, helping them implement Net Promoter Score (NPS) survey processes. A conclusion I've come to is that, especially early in a company's evolution with the NPS methodology, simple is best. I've found this to be particularly true for businesses that have never tried to measure customer feedback in a structured program before. If I were advising a new client today on starting an NPS program from scratch, one that never had a customer survey process in place, I'd largely recommend a very simple two-question NPS survey that would run for maybe six months or a year as a pilot program.

The purpose of the simple approach is to acquire a broad data set that can be differentiated by transaction point, relationship type, geography, product, or other variables, making it possible to identify loyalty drivers and define the processes through which the client can affect them.

The two questions I would want to ask:  #1. How likely would you be to recommend company XYZ to colleagues and friends?

Then, depending on the answer to #1, one of three distinct questions.

#2 for 0-6 scores: "What, in your opinion, could we do better?"
#2 for 7s and 8s: "What are the three things you like best, and least, about us?"
#2 for 9s and 10s: "Please explain your rating."
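
The branching itself is trivial to express; here's a minimal sketch using the question wording above:

    def follow_up_question(score):
        """Pick the second question based on the 0-10 likelihood-to-recommend score."""
        if score <= 6:      # detractors
            return "What, in your opinion, could we do better?"
        elif score <= 8:    # passives
            return "What are the three things you like best, and least, about us?"
        else:               # promoters (9s and 10s)
            return "Please explain your rating."

    print(follow_up_question(7))  # the passives question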

Businesses new to NPS typically need to learn what their customers really value and dislike most about them. Customers may value transaction efficiency or personal relationships; they may value cost effectiveness or product characteristics; or they may value something else entirely. Generally, businesses have an understanding of what their customers value. But they often fail to understand, or (more likely) fail to create, action processes that mitigate issues affecting their loyalty drivers.

Given that NPS surveys can initially be this simple, with just two questions delivered to each respondent, the challenge for businesses is ensuring that they understand who their survey respondents are, and what they're reacting to, for each completed survey response. Collecting NPS data by source or transaction point is therefore important. And, in the case of invited surveys, it's important that each respondent's "background" data (role, geography, product/service owned, etc.) be captured as part of the survey process. If the business does this, it learns which transactions or relationships generate the most "likely to recommend" responses, and why or why not (through question #2). After this data has been studied, a second-generation NPS survey can be developed that tracks specific "loyalty drivers" for each transaction type, data source or relationship type.
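
For reference, the NPS arithmetic itself is simple: the percentage of promoters (9s and 10s) minus the percentage of detractors (0-6). A sketch that scores hypothetical responses segmented by transaction point:

    def nps(scores):
        """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..100 scale."""
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return round(100 * (promoters - detractors) / len(scores))

    # Hypothetical responses tagged with the transaction point they came from.
    responses = [("checkout", 9), ("checkout", 10), ("support", 4), ("support", 7)]

    by_source = {}
    for source, score in responses:
        by_source.setdefault(source, []).append(score)

    for source, scores in sorted(by_source.items()):
        print(source, nps(scores))  # checkout 100, support -50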

In 3rd generation NPS survey processes, creating action taking processes based on real-time survey data becomes important.  The reason? Only by acting on feedback does a business mitigate or otherwise impact issues affecting loyalty drivers. This is the real pay-off for NPS processes.  Acting on feedback in ways that mitigate loyalty affecting issues has a long track record of improving business success. 

When starting down the NPS path there are many ways to go astray. Taking a simple approach helps avoid unnecessary resource expenditure while allowing the development of a plan that ultimately lets the business affect loyalty drivers through closed-loop follow-up at the right times and in the right places. It may take a year or so to go from NPS start-up to full NPS implementation, but it's a year worth investing in. Done correctly, it doesn't have to cost a lot to get started either.

Stewart Nash
LinkedIn: www.linkedin.com/in/stewartnash






Tuesday, May 27, 2014

QuestBack v13 Released




As readers of this blog will note, I normally don't shill for QuestBack here, preferring to talk about feedback management challenges and different kinds of solutions. But, I'm going to make an exception today because I believe that QuestBack's Ask&Act product has reached a point where its value proposition has become especially compelling for businesses that want to operationalize their feedback results.

Over the weekend, QuestBack AS released version 13 of its Ask&Act product. The new version features some really nice enhancements to its on-board reporting function. For instance, QuestBack reports are now a lot more flexible, both in terms of how data is reported (charts, tables, etc.) and what is included in a given visualization. QB v13 lets me edit more of the items displayed in a visualization. And it lets me take the same data and apply different analyses to it in different visualizations within the same report. There's also more ability to add, and edit, text in and around the charts being presented.

This kind of enhanced reporting lets me see the same data side by side: in raw form; filtered by time elements, background variables, or other questions; or in combinations of all the above. It lets me view comparisons of time series, filtered views, etc., too. So, it's very flexible in letting me decide how my survey data should be presented.

Best of all, as in past versions, QuestBack's reporting allows me to easily share visualizations with whomever I give a URL and password to. All the visualizations I create update themselves in real time as survey data comes in. People viewing the visualizations via a URL are in read-only mode, but it's a very effective way to easily add survey results to corporate web sites or internal dashboards.

QuestBack's new reporting tools are flexible, powerful and easy to use.  Report results are free for sharing purposes, letting me build and run operational feedback programs that automatically update the visualizations that business process owners need to see.  Once running, there's no need to "babysit" a QuestBack survey.

QB v13 has some other cool features as well.  If you're looking for a lower cost, high value add approach to operationalizing your survey data, QuestBack could be very useful.

Tuesday, May 6, 2014

VOC at Art.com - Using Etuma360 insights to be better and sell more


Followers of this blog will know that I represent Etuma, Ltd. here in the US and that I sell and support the product. I've been working for over a year with Art.com on an evaluation, and eventual acquisition, of the Etuma360 product. Following is a little bit about Art.com, their implementation and usage of Etuma360, and some of the benefits they've received from it. Thanks very much to Leanne Onstott - Senior Director Customer Sales and Service at Art.com - for the kind words and testimonial.
                                  -------------------------------------------------------------

A little about Art.com

Established in 1998, Art.com has helped 10 million customers in over 120 countries. Art.com is the world’s largest online retailer of posters, prints, and framed art.  They offer the world's largest edited selection of wall art products with over 700,000 images including posters, art prints, tapestries, photography, wall signs, limited editions, hand-painted originals, and exclusive products, as well as a variety of high-quality finishing services including custom framing, wood mounting, and canvas transfers.  Art.com, Inc. runs five sites in the USA and has a strong international presence with 25 local sites in Europe, Japan, Canada, Australia, Mexico and South America.
                                 --------------------------------------------------------------

Why Etuma360?

In the very competitive world of internet retailing, pricing, inventory, service and customer experience really matter. Knowing the importance of customer feedback for understanding how they're doing, Art.com collects a lot of feedback from a number of sources. They needed an effective, and cost-effective, way to interpret that feedback. And, because the feedback is largely text-based and arrives in several languages, they needed a multi-lingual text analytics solution. They found Etuma360 from Helsinki-based Etuma, Ltd.

Art.com currently uses Etuma360 in three main applications: Monitoring the global customer experience, Improving sales agent performance and Monitoring their Beta Test website.  “We were capturing customer feedback from various sources and in multiple languages”, says Leanne Onstott - Senior Director Customer Sales and Service.  “We needed a way to accumulate and understand what our customers found as barriers to happiness and repeat purchase with our website and processes.  Using Etuma360 has allowed us to have a centralized view of our customer pain points.  This allows us to prioritize work efforts around addressing key issues.”

Improving Sales Agent Performance:

“We also needed a way to analyze our sales agent’s chat dialogues with customers to understand how to improve their ability to convert sales during chats.  Etuma360 has helped us understand what our best agents chat about and helps us give input to our agents about how to be better in their chats with customers.  Since customers in chats may have experienced issues with our self service processes, Etuma360’s chat analyses have been a key to helping us understand what was confusing to them on the website and to help us reduce those self-serve barriers to sales conversion.”

Website Beta Testing:

“Recently, we launched a beta version for one of our websites where we used Etuma360 to isolate chat feedback from beta site customers to learn what changes we needed to make to the website's look and feel in order to prevent customer attrition and limit backlash.

“The really cool factor for Etuma is its ability to customize reporting and drill down on key customer topics. It's a big plus for a small company with a global footprint to have the machine translation capability to really hear and understand the voice of the customer."

                                               ---------------------------------------

Being in the business of helping organizations collect, analyze and act on customer feedback, my experience tells me that it's a rare circumstance where a business can get the kind of cost-effective solution that Etuma has been for Art.com.  Getting that solution to work required some effort from a few key Art.com people and a good deal of education about Etuma360 and its capabilities.  But the payoff is that Art.com has achieved an essentially do-it-yourself (DIY), highly functional text analysis capability that helps them build a better business using customer feedback.  All done without any large cash outlays.  This proves that the DIY model can work in the realm of text analytics.

Stewart Nash
stewart@etuma.com
LinkedIn: www.linkedin.com/in/stewartnash

Monday, March 10, 2014

Etuma - Lowering Costs for Advanced Text Analysis

A quick Monday morning note:

Anyone who has looked into using text analysis technology very quickly realizes that the products available on the market require substantial investment.  But, this is beginning to change. 

Technology investment costs occur along three paths: product license costs, solution implementation costs, and the internal time and effort needed to support on-going use of the technology.  For text analysis, the benefits have typically had to be in the millions of dollars per year to justify the solution costs.  As a result, as with most technologies, bigger firms with bigger needs, and bigger potential benefit pools, are the first to adopt.  So far, market development for text analysis tools is following this model.

Over time, vendors look to offer their products to larger markets by offering packaging that fits smaller benefit pools that are more broadly applicable to larger groups of potential customers.  Again, text analysis is following this market development model. 

Finland-based Etuma Oy (www.etuma.com) has found a way to package its text analysis service so that it is easily affordable for almost any company seeking to better understand its free-form text feedback.  They have priced a new package of Etuma360 at roughly $400.00 per month.  The package allows one stream of text data of up to 25,000 text items per month.  It's based on a single named user but includes their "Research" and "Dashboard" analysis tools.  Data sources can be customer surveys, Facebook feeds, 3rd-party websites or most anything else.  For roughly $5K / year, a company can process a lot of text and get some very valuable analyses done on it - simultaneously in 10 European languages.

Etuma's new offering should be perfect for companies that want to interpret free-form, text-based customer feedback coming through a dedicated channel like a customer survey or a website form.  This kind of pricing model has the potential to really broaden the market for text analysis by taking something that was an enterprise-level decision and moving it down to a departmental-level decision in many companies.

Etuma will be making a product announcement fairly soon, I would guess.  But, to me, this kind of pricing model makes an enormous amount of sense because many applications of text analysis are at the department level. 

More to come on this......