The VP of a Fortune 100 company recently asked me for some advice. Her company was heading into its peak selling season, which would last three months at most, and it could measure only revenue from the website, nothing else. It did not have any web analytics tool.
Her question: which web analytics tool do you recommend? We want to improve our website and increase sales.
My answer: don't implement a web analytics tool; implement a short website survey with just three questions.
Surprised?
There were a number of factors behind my recommendation, but one of the main ones was this: if you want to move really, really fast and you don't know anything yet, it is better to ask customers what you should do than to implement a tool and try to figure it out from clicks. You will get better insights, faster than you can imagine.
Web analytics is awesome (you do expect me to say that, don't you! : )), and it has to be a critical part of your web strategy because it can yield great insights. But when it comes to the fastest way to understand customer problems, there is nothing like asking the customers themselves (and yes, it will lead to improved revenues).
Given the time crunch, going to the customer made the most sense. Implementation of the web analytics tool would happen after a month or so, and slowly over time ClickStream data would play an increasingly important part in decision making. In due course it would achieve parity with the qualitative data (but hopefully never overshadow it).
Ok, so the survey got the nod. What questions to ask?
[UPDATE: Qualaroo is an affordable survey tool for creating the survey below, and it has a user-friendly survey invitation model. Another option is Google Consumer Surveys, which has a free version with the questions below.]
Here are the three questions no survey can live without (and often a survey can be pretty awesome with just three questions):
What is the purpose of your visit to our website today?
It can also be framed as "what is the reason for your visit today" or "what task are you looking to accomplish on our website today" or "why are you here today".
Few website owners have a good understanding of why people visit their websites, and this is one of the best possible ways for you to find out that critical piece of information. The answers you read, and the distribution you get across different primary purposes, will be eye-opening, and they will help explain so much of the "weirdness" you see in your ClickStream data (and yes, even your path analysis).
Were you able to complete your task today?
If you like conversion rate and revenue, then you are going to love this one. It is an extremely simple question that asks survey takers to self-report their own perception of your website's effectiveness in helping them complete their tasks.
With this question we don't have to rely on hypotheses such as "if visitors saw this page then they might have gotten their question answered," or "if I am measuring conversion then I can understand how effective my site is," or "our site is doing great because we just launched a massive quarter-of-a-million-dollar redesign." We have the customer's voice telling us exactly how well the website is performing when it comes to delivering the goods.
Now, in case you want to know exactly what you need to do to improve those task completion numbers, you'd ask a third question:
If you were not able to complete your task today, why not?
It can also be framed as "If you were not able to complete your task, please explain why," or "Why were you not able to complete your task on our website today?" or simply "How can we improve our website to ensure you are able to complete your task?", etc.
The answer to this question is open-text VOC, Voice of Customer. It is best to refrain from making this a drop-down with choices like "improve internal search," "update the navigation," or "provide more product information." Let the customers talk; give them a chance to tell you, in their own voice, the reasons and to provide you with suggestions. It works better than you guessing what the answers might be and suggesting those.
Analysis for this question is done by categorizing the responses into common themes and then calculating the percentage of times each theme occurs in the open-ended VOC from those who were not able to complete their task. This is your simple and direct to-do list, straight from the horse's mouth, of what you should work on in order to improve the website experience for your customers.
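If you collect more verbatims than you can comfortably read, a tiny script can do the first-pass tallying for you. This is only a sketch, not a recommendation of any particular tool: the theme names and keyword patterns below are made-up examples you would refine after reading a sample of the actual responses.

```typescript
// Rough first-pass categorization of open-ended "why not?" responses.
// The themes and keyword patterns are illustrative placeholders; tune
// them after reading a sample of real verbatims. Note that a response
// can match more than one theme, so percentages may sum to over 100%.
const themes: Record<string, RegExp> = {
  "Internal search": /search|find|locate/i,
  "Navigation": /navigat|menu|link/i,
  "Product information": /spec|detail|description|price/i,
  "Site errors": /error|broken|crash|404/i,
};

function tallyThemes(responses: string[]): Record<string, number> {
  const counts: Record<string, number> = { Other: 0 };
  for (const name of Object.keys(themes)) counts[name] = 0;

  for (const text of responses) {
    let matched = false;
    for (const [name, pattern] of Object.entries(themes)) {
      if (pattern.test(text)) {
        counts[name] += 1;
        matched = true;
      }
    }
    if (!matched) counts.Other += 1;
  }
  return counts;
}

// Example: % of "could not complete my task" respondents per theme.
const verbatims = [
  "Your search never finds the product I want",
  "Couldn't see shipping details anywhere",
];
const counts = tallyThemes(verbatims);
for (const [name, n] of Object.entries(counts)) {
  console.log(`${name}: ${((n / verbatims.length) * 100).toFixed(1)}%`);
}
```

The keyword approach is crude, but it is good enough to surface the top two or three themes, which a human can then verify by reading the verbatims in those buckets.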
These three simple questions will be the source of a wealth of insights when it comes to helping you deliver on your customer-centric strategy. [Want more: Got Surveys? Recommendations from the Trenches.]
ClickStream data is often missing context, and in that absence we overlay our own opinions, experiences, and perspectives to make sense of it all. With answers to these simple questions, you'll have the context you need.
What do you all think? Have you tried surveys? Do you have a golden question or two that you use? Please share your own tips and feedback on my question suggestions above via comments.
Nice post Avinash. I especially like this question, as it reminds me of something else:
'Were you able to complete your task today?'
I often like to compare internet business to offline business. We have this big supermarket here in Germany (Kaufland) and every time you buy something you're asked something along the lines of 'Was everything alright?'. They started doing this a few months ago…
I'm really wondering if they ever get any constructive feedback from such questions in the offline world. For example, I would never say: I didn't appreciate the fact that I had to go upstairs to the third floor in order to find the dairy products & then go back down to the first floor, because that's where the beverage is (bad example, but you get the drift, right?).
I think being anonymous on the web helps, because visitors can give whatever feedback they have on their mind, without possibly coming off as unfriendly or odd.
On the internet, forms and collecting user feedback (if the surveys are short!) seem to be really great tools: somebody not familiar with the web might not expect to get much positive feedback from that (in case it's not working well offline?), but I'm often tempted to give constructive feedback when I see a form with a question that I know an answer to.
Don't mean to overanalyze this, but I think it's similar to the need we have to socialize (that other marketing techniques thrive on): If we think we have a good idea, some of us want to tell the webmaster. Why not do it after all?;-)
Reminds me of some web design or usability book I read, in which the author wrote something like 'You wouldn't believe it, but people actually DO use those forms' ;-).
I just realized that I shouldn't have narrowed it down to this one question.
And I should have put 'beverages are', I guess ;-(
Great Point, Avinash:
One of my greatest challenges is managing these overlays when I publish reports. The differing views of the same data, based on personal experiences, cause debates that get pretty heated at times. Because of the broad range of potential interpretations I am very careful when using click-stream. I agree that a couple of usability questions (especially ones like those you cite) really help to cut through the ambiguity.
Great post Avinash. My only observation would be that many people don't complete these surveys. And without web analytics, the site owners wouldn't know what percentage of visitors chose to complete the survey (i.e. do the opinions represent 'just' the few vocal fans, or are they a real cross-section of your visitors…) Maybe combine the survey with a really simple analysis of survey completion vs. abandonment?
Looking forward to your book.
Chris
Chris: I would encourage you to experiment and give it a try; you'll be surprised. I was.
Here is an actual example: our internal company audience was skeptical, but when we launched a 20-question survey (!!) we routinely got response rates of between 6% and 24%. Compare that to the internet-standard survey response rate of one percent.
Two interesting lessons learned: 1) People love to talk if you give them a chance; they are, after all, your customers. 2) You do have to time the presentation of the survey right – experiment with options such as pop-up, pop-under, on exit, etc.
On a decent-sized website you only need around 300 responses to get statistically significant results, and around 1,000 if you want to slice and dice the data to measure micro-segments of your questions / customers.
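As a rough sanity check on those numbers, here is a back-of-the-envelope margin-of-error calculation. It is only a sketch, assuming a simple random sample and the worst-case 50/50 answer split:

```typescript
// 95% confidence margin of error for a proportion, assuming a simple
// random sample and p = 0.5 (the worst case).
function marginOfError(n: number, p = 0.5, z = 1.96): number {
  return z * Math.sqrt((p * (1 - p)) / n);
}

console.log((marginOfError(300) * 100).toFixed(1));  // ~5.7 points
console.log((marginOfError(1000) * 100).toFixed(1)); // ~3.1 points
```

Roughly plus or minus 6 points at 300 responses and plus or minus 3 points at 1,000, which is usually tight enough to act on the big themes.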
Watching the response and abandonment rate is, as you say, critical.
Thanks so much for sharing your comments.
-Avinash.
I would suggest integrating the survey data with your clickstream data. Being able to group responses and segment users with certain responses gives you valuable insight into how they interacted with the site. Using a really good segmentation engine or reprocessing engine you can see how the interactions (behavior) affected the qualitative data.
CSI investigators look at all evidence, so they know the how, what and where. They also interview those that may have information about the crime to understand the why. You need both for the full picture.
Avinash,
While you are 100% correct that voice of customer is a great complement to traditional analytics, it is not just about a "survey". Without an accurate, precise and reliable measurement of voice of customer, the data we gather can actually do more harm than good. It is not only about getting voice of customer, but getting it at the right time, in the right manner.
To your 3-question approach: they are clearly 3 important questions, but not enough! Primary purpose of visit, completion of task and why not able to complete are all important. But we are missing a very important thing. If someone comes to your site today and, after a gruesome experience, is finally able to complete the task they set out to do (let's say, for example, download an update to the software), simply looking at those three questions will give you a false sense of security that all is good.
If they were very dissatisfied with the experience, and as a result are likely to say negative things about your site/company, and not purchase from you in the future, you will have missed all of that with the 3 question survey.
Don't fall into the trap that there is only one (or three) things you need to know. Our customers are very complex and we cannot simply understand them with one or three questions.
Ultimately, the one concept that has been proven over the years to predict financial performance is satisfaction. But we cannot simply ask that one question either. We need a system of attributes to measure that includes your magic 3, but also what drives satisfaction, overall satisfaction, and what is likely to happen as a result of the experience.
-Larry Freed
http://www.ForeSeeResults.com
http://www.FREEDyourmind.com
Larry, my friend, you are right of course. The implication of my post is not so much that you should only ask three questions, but that these are three you should always ask because, IMHO, they will be chock-full of actionable insights.
I have loads of experience with ForeSee, your tool, and think very well of it. ForeSee asks more than three questions (as does iPerceptions). I encourage experimentation to figure out the right number, because there is no correct "off the shelf" answer such as "one should ask 28 questions." Each business will find the right number of valuable questions for its unique circumstances.
Makes sense?
Thanks so much for sharing your valuable perspective, it is always welcome.
-Avinash.
I sort of agree with what Larry says, but I think you could easily fold that fourth question into the third one by changing it to "or what else did you find annoying?" or something along those lines (or even add a fourth question).
I don't really think Avinash – who seems to be extremely intrigued by the (growing) complexity of the web, user behavior and web analytics ;-) – believes all it takes is looking at those three questions. The beginning of this blog post makes it sound more as if he were trying to provide a quick way to improve the website and thought those 3 questions would be the most effective solution for that 'quick fix'.
However, I'm wondering why you would deter a Fortune 100 company from using web analytics if they are doing nothing whatsoever already? Does this have to do with the short amount of time the company apparently had in order to improve their website for their peak selling season?!
Avinash hadn't posted when I began writing my comment, and the last paragraph was for Avinash, not directed at you, Larry, btw, just so there'll be no misunderstanding :-)
I think the more questions you can ask the better, probably, but I guess it all comes down to the amount of traffic you get… because the more questions you ask, the less likely visitors will be to reply to them… and you need a big enough sample for your survey data to be statistically significant/insightful in general, I guess…
Oh, but then again, if you ask too many questions (and too complex ones) your data might get skewed, too, as some survey takers might want to just get it over with and thus not think as deeply about the questions & their replies… so you probably need to find a balance.
I agree with your approach, but would either add or replace a question with the below question or two:
Would you recommend this website to a friend? If no, why not?
Keep up the good work!
ps – I certainly agree with your approach of going beyond just usability (and even optimization) and just asking your customers, too.
Hi Avinash,
Great post! I agree with this immensely. Mostly because this is the quickest way to make the right improvements to the site with such a short peak season. Of course, an analytics/clickstream tool is always necessary; however, if I had only 3 months to work with, I'd opt for the surveys, just because you can go down so many ratholes trying to infer customer intent with an analytics/clickstream tool. When a customer tells you, "I came to download and your download process sucks," there's not much to infer from that. You go fix your download process.
How timely this article is for me.
I'm debating between hiring an analytics consultant or running a survey to get a better idea of why we are losing customers between the time they click on the BUY button and actually finishing the checkout.
Of course, one issue is that our site is brand new, so that may have something to do with it, but I'm concerned that requiring registration prior to checkout is depressing orders.
Unfortunately I'm not nearly knowledgeable enough about Google Analytics to fully understand the reports (I need to spend time digging deep into Avinash's blog to get smarter :)
Anyway, I think we will start with the survey and try to intercept browsers as they leave the site. For smaller sites I have found SurveyMonkey.com to be a great tool to easily create surveys, and they give you great tools to analyze the results.
Avinash,
The spirit of your questions is spot-on. The implementation is what I would quibble with. Having been involved with many surveys that have attempted to ask those questions (or some variation), I can tell you that the term "task" can sometimes be confusing for the average person.
What was my "task" when I came to your site this morning? I can't answer that question. I didn't think of what I was doing as a "task". Ok, so did I have an "objective"? Nah, that doesn't capture it either.
I don't think there's any easy solution here. As I've posted on my site, I think what we as marketers are really trying to get at by asking these questions is what the site visitor's expectations were, and whether or not those expectations were met. But I wouldn't use the word "expectations" in the survey question either.
— Ron
Avinash,
This was a great post that we are actually implementing today. We are an online survey software company and we know that our site is not set up well on the home page. Our software is great and as soon as someone tests it out they love it. The problem is getting them to test it.
We are going to practice what we preach and start surveying our prospects and clients.
Thanks for the great idea.
Ryan
I love your cut-to-the-heart-of-the-matter attitude, Avinash. Details aside, your three golden questions are essentially what I ask when I do customer interviews about product needs.
I don't ask "what improvements would you like us to make to the product?" I ask "what are you trying to accomplish and what obstacles do you face?"
This gets directly at what you need to know to improve what you provide to the customer. And it leaves the door open wide enough to whatever issues the customer may be having that you are sure to learn some things.
My bias has always been to start with qualitative research tools like this and then, once you've categorized the common issues, start quantifying them and measuring the effects of different strategies on them. That's where web analytics would come in.
You've convinced Lynda above. She says, "we will start with the survey and try to intercept browsers as they leave the site." That's the right first step.
Yesterday a client asked me if I could cite any examples of a survey leading to an uplift in trade.
I've got plenty of evidence of customer surveys bringing in real gems of information about obscure glitches which we might otherwise have missed.
For example, take an e-commerce site with a double-digit conversion rate. The site is working very well by many standards. But we can still make it better…
Customers are very loyal and around 50% fill in a survey (this is a confirmation page one, not the more interesting general one suggested by Avinash) and around 16% then add comments in the VOC box. Those comments produce real gold. For example: snags with routes to the edit credit-card-details system (not the system itself) are hard to spot in the clickstream. Without knowing the intent of such users we would be unlikely to segment for them and they would be lost within the general noise of the checkout funnel. A usability study would find it, but only if something prompted us to test for returning customers with expired cards — a fairly small group unless your site has very loyal customers.
So there's an example which will produce an uplift in trade over time. But I think this client was asking about something even harder to quantify. Does the mere fact of asking for the information in itself demonstrate a willingness to engage in dialogue with the customer, and cause an improvement? That would be very hard to measure, so I wonder if anyone here has any thoughts on the subject, please.
In a bricks and mortar store we tend to shy away from salespeople who ask "Can I help you?" — at least in the UK where the automatic response is "No, I'm just looking." But for the moment the effect would probably be the other way round on line. A site which showed some form of interest in the visitor might turn out to be more interesting to the visitor. I'd love to find some evidence of that…
Tim
My question, having done one of these on our site, is – now what? I've uncovered some surprises and discovered some strengths but what can I say to management about what to do next?
25% of visitors report failing because of a specific problem. If I fix that what should I expect?
Great things, no doubt:-) But is there any way to guess how great?
Hi Avinash. I'm interested in online survey capability. In terms of selecting a vendor for this, what are the important capabilities you should look for?
I assume that you need to be able to specify when and where the survey would be asked, and that this would need to be done as folks were leaving the task you want to measure? Is that right, and is there anything else that you feel is important?
Tsk, tsk. Too many people speaking out without any REAL evidence or experience with any of this stuff. Bottom line: Web Metrics by themselves are useless — if you don't know 'why' you know very little. Tools like iPerceptions are critical for the 'why'. That said, I have never recommended the purchase of iPerceptions to a client yet (even though it is one of the best tools in the market) because they don't tie their data to the transactions (but allow you to do so).
Since many clients have nothing at all (and typically are not ecommerce focused), I leverage Usability Sciences' WebIQ, which gives you both the transactional and the qualitative data [no confusion… this is not a transactional toolset — it is not focused on transactions in total].
At Texas Instruments, where I gained the greatest hands on exposure to a breadth of tools, I relied on data both from HBX and from WebIQ.
I like this for its simplicity. Too many times analytics experts will be tempted to ask a lot of questions, not realizing that the more trouble they cause the participant, the more likely they are to get bad or incomplete data. I'd much rather have three questions that are answered 100% truthfully and 90% completely than have 25 questions answered 50% truthfully and 20% completely.
Hi there,
I think this does not make any sense; what is the point of asking those three questions?
Using your web analytics tool, you can easily answer the three questions yourself. In addition, you won't spend your precious time going through a lot of spammy answers.
I think that those three questions enter into the game powerfully when it comes to offline marketing.
Thanks
arabian4ever : The problem with all the web analytics data is that it can only tell you what it knows; it can't tell you what it does not know.
You bring an interesting perspective, but I believe that data, in this case, can be deceptive in the insights it can provide.
Here is an example. I come to your site, I am looking for a job (or support, or a product I saw in a store, or ….) and you don't have it on your site. How would the web analytics tool tell you that? The survey would.
Or I am on your website and it stinks at major things. How would you know? You can look at top exit pages. You can look at segmented content or visitors. In the end you will have to rely on your, potentially limited, point of view (and experience in life) to guess what stinks. If you use a short survey you can let the customers tell you.
In the end here is a good way to think about it. Web Analytics is good at the What. It is not good at the Why. The Why can only come from the customer.
Here's a post that might be interesting:
Overview & Importance of Qualitative Metrics
Hope this helps,
-Avinash.
But can we expect customers to answer these questions? I doubt we have such patient customers. Further, at times a customer could end up buying, but by the time he/she has bought something, he may be so frustrated by the difficulty he had locating what he wanted (from my own experience) that he decides to buy from another site next time. In such a case, does this simple question make sense:
"I find shopping here…" and then have a few options from Easy to Hard, or Rapid to Time-consuming.
In simple terms that could give us an idea about our user navigation, because it's the customer who knows how user-friendly the navigation is.
Nilesh : I encourage experimentation over entrenched opinions. You never know what you don't know.
For example:
My first survey on a large ecommerce site had 24 questions (all on one page in a pop up). I was positive it would stink – and I said so to the survey vendor.
The response rate (completed forms) was 18%.
I was astounded. And embarrassed because I was overlaying my own opinions on what the customer might do.
Of course, since then we have fewer questions, much better surveys (ajax and everything!), greatly improved targeting systems, etc. With each, the response rate becomes better and, more importantly, our ability to listen has dramatically improved.
Moral of the story: 1) Experiment; it can't hurt that badly, and you might just be proven wrong (or right, and then you'll know for sure). 2) If you give your customers a mechanism to speak, they like to talk.
Hope this helps.
-Avinash.
Avinash,
Where do you recommend placing this 3-question survey, and how prominently? I assume that if you put it at the point of conversion, you will not reach many people who do not even get there, but if you put it on the homepage, you will have people unable to answer questions 2 and 3.
Thoughts?
Hi Avinash,
I'm doing some research about surveys to share with my students. I've seen that a common problem with this kind of survey is that it is often displayed during navigation, so you get answers to question 2 like "I don't know. I haven't finished my search yet." Is there any way to avoid this?
Pablo: You are absolutely right, it can be an issue. Hence I recommend an "on exit" survey; it is only shown when someone leaves the site.
We have recently released a free survey that does this now, please check out this post:
https://www.kaushik.net/avinash/2008/03/4q-the-best-online-survey-for-a-website-yours-free.html
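If you would rather wire up a quick version of the on-exit trigger yourself, it can be as simple as watching for the cursor leaving through the top of the viewport. This is a minimal sketch only; showExitSurvey and the element id are placeholders for whatever actually renders your three questions:

```typescript
// Minimal "on exit" trigger: show the survey once, when the cursor
// leaves through the top of the viewport (a common proxy for intent
// to close the tab or type a new URL). showExitSurvey() and the
// "exit-survey" element are placeholders.
let surveyShown = false;

function showExitSurvey(): void {
  document.getElementById("exit-survey")?.classList.remove("hidden");
}

document.addEventListener("mouseout", (event: MouseEvent) => {
  const leavingViewport = event.relatedTarget === null && event.clientY <= 0;
  if (leavingViewport && !surveyShown) {
    surveyShown = true;
    showExitSurvey();
  }
});
```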
Hope this helps.
-Avinash.
Avinash,
Great post, I agree with you; getting "what we don't know" from customers is key to making improvements. This will help our Alternative medicine community site given the breadth of topics we cover.
One question – what's the best practice as to where to place the survey? Options I am thinking of are 1) a button on top of every page or 2) a pop-up at the time of exit. Any thoughts, or are there off-the-shelf packages to achieve this?
We went ahead and implemented this survey last week on our staff blog. Can't wait to see how it comes back!
http://multiply.multiply.com/journal/item/299/Survey_Three_Questions
Hey,
I decided to create a 4Q survey for my client and also signed up. I was able to complete my 1st step successfully. Unfortunately I didn't get that "specific link" from iPerceptions after the 1st step, as you mentioned in your YouTube video. I am unable to edit the questionnaire according to the relevance of my client's website. Can you please help?
Thanks!
Hi Avinash,
Yes, this one is a winner for me; I will be requesting this of our web developer. My thoughts now turn to implementation: how do I encourage people to fill out that survey?
Thanks!
Damian: Keep your survey short, and experiment with different types of non-intrusive invitation models. I've been playing with a small box that shows up on the bottom right of the browser, and only for people who have seen x number of pages.
Both Kissinsights and 4Q have this type of an invitation model.
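For anyone curious what that looks like under the hood, here is a minimal sketch of the "only after x pages" invitation logic. The page threshold, element id, and survey URL are placeholders, not anything a particular vendor ships:

```typescript
// Show a small invitation box in the bottom-right corner, but only for
// visitors who have viewed at least PAGE_THRESHOLD pages this session.
// The threshold, element id, and /survey URL are placeholders.
const PAGE_THRESHOLD = 3;

const pagesViewed = Number(sessionStorage.getItem("pagesViewed") ?? "0") + 1;
sessionStorage.setItem("pagesViewed", String(pagesViewed));

if (pagesViewed >= PAGE_THRESHOLD && !sessionStorage.getItem("inviteShown")) {
  sessionStorage.setItem("inviteShown", "1");
  const box = document.createElement("div");
  box.id = "survey-invite";
  box.style.cssText =
    "position:fixed; bottom:16px; right:16px; padding:12px;" +
    "background:#fff; border:1px solid #ccc; z-index:9999;";
  box.innerHTML =
    'Got a minute? <a href="/survey">Answer three quick questions</a>';
  document.body.appendChild(box);
}
```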
-Avinash.
I have been doing internet marketing since 2006 and have never used a survey. Maybe I have when speaking to a customer on the phone, but not on my actual website.
I think surveys should be an integral part of the online marketing process even if they don't directly rank a site higher. In the end the customers' opinions are what matter, because they are the ones forking over the money for services and products.
I already implemented 4Q, but I am also interested in https://www.kissinsights.com.
I saw this on Avinash's AWESOME Web Analytics Training Course.
I wonder if any of you have tried using something simple like a 4Q survey, but adding an invitation to users at the conclusion of this survey to participate in a longer survey if they'd like to help improve the website? It's something we've been interested in trying out for a while, but haven't had the pressing need for a longer survey to make us take the step.
Anyone doing this? I'd be interested to know what results you're getting.
Tony: It is certainly worth experimenting with. I believe that the paid version of 4Q allows you to do something like that, or at least to have a link on the last page, and you can point that link to, say, a deeper survey you've created using Google Docs (free of course :).
Avinash.
Hi Avinash
Although this post is pretty ancient in web terms I hope you can still help me with a bit of a dilemma.
The company I work for owns several websites. We've been trying to improve them over the years using web analytics and usability testing and sometimes by conducting online surveys. Those surveys tend to be really really long though and in the end I usually find only three or four questions really interesting. That's why I want to try something different and conduct an ongoing survey with your three questions.
However, one of my colleagues, who is responsible for our financial news website, thinks it's absolutely ridiculous to ask those specific three questions on a news website. He thinks the purpose of the visits to his site is clear (read news) and that you don't have to ask. I don't really agree with him, and furthermore I would just like to try it and see what happens, but he is adamant that we will make fools of ourselves if we do so.
Do you have a convincing argument why these three questions are suitable for news websites as well?
Anne: It is very hard to argue with entrenched opinions. All I would say is that if you use something like https://qualaroo.com/ it will take you and your BFF just 5 minutes to create and launch the survey with the three questions.
Why not let real world users and real world data help you decide if this is a good idea or a bad idea?
As to the question of why people go to a news site… I concur that many people will be there to read news. But the beauty of the primary purpose question is that it will help you understand all the other reasons people might come to the site. And if that data is material, use it to improve your site. If it is not, in the next iteration of the survey just ask two questions. :)
Avinash.
PS: The only way you might be making a "fool of yourself" might be with your loooooooong survey. I have yet to encounter an instance where a short, focused survey ever led to a "fool of yourself" outcome!
Hi Avinash,
I manage the site satisfaction survey for my company, and we did an analysis where we merged the survey and clickstream data together, then compared the survey population to our overall site population. From this analysis we saw that the survey respondents are not representative of the overall site population in terms of engagement, OS composition, etc.; however, they were representative of some of our most engaged and valuable customers (a very active, but small, % of the total population).
My manager saw this analysis and is now skeptical of the usefulness of the survey data, even though he acknowledges that the survey respondents are made up of these highly engaged users on our site. Do you have any recommendations on how I can potentially address my manager's concerns?
Thanks,
Eric
Eric: There will never be a 100% match between quantitative data and qualitative data. But each has holes that make it mandatory to use the other as a complement in order to make smarter decisions.
For your survey, as pointed out in this post, you need to ask both the Primary Purpose and Task Completion Rate questions. If you do, just make sure you have enough responses for all Primary Purposes and you should be all set. Either to make decisions about what to change right away, or what A/B tests to run to continuously improve the site performance.
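One quick way to check that is to tally responses by primary purpose and flag any bucket that is too small to trust. A sketch only; the field names and minimum count are made up and depend on what your survey tool exports:

```typescript
// Check that each primary purpose has enough responses to be worth
// segmenting on. Field names and the minimum are illustrative only.
interface SurveyResponse {
  primaryPurpose: string; // answer to question 1
  taskCompleted: boolean; // answer to question 2
}

const MIN_RESPONSES_PER_PURPOSE = 100;

function completionByPurpose(responses: SurveyResponse[]): void {
  const byPurpose = new Map<string, { total: number; completed: number }>();
  for (const r of responses) {
    const bucket = byPurpose.get(r.primaryPurpose) ?? { total: 0, completed: 0 };
    bucket.total += 1;
    if (r.taskCompleted) bucket.completed += 1;
    byPurpose.set(r.primaryPurpose, bucket);
  }
  for (const [purpose, { total, completed }] of byPurpose) {
    const rate = ((completed / total) * 100).toFixed(1);
    const note = total < MIN_RESPONSES_PER_PURPOSE ? " (too few responses)" : "";
    console.log(`${purpose}: ${rate}% task completion, n=${total}${note}`);
  }
}
```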
Your manager, from the limited information here, seems not so much skeptical about qualitative data but rather all analytics data. In this case, you need to figure out how to earn his love/trust. Here are two posts with ideas:
~ Six Rules For Creating A Data Driven Boss!
~ Empowering Analysis Ninjas? 12 Signs To Identify A Data Driven Culture
Good luck!
Avinash.
I just ran across your article, "The Three Greatest Survey Questions Ever". While analytics is indispensable, there is nothing like getting direct customer feedback, and your three question approach is simple and direct.
Do you have a suggestion on a simple tool that can be used to categorize responses?
Richard: I am afraid I have not found a public source that I trust to provide sufficient value. Though there are many out there:
https://www.google.com/search?q=analyze+text
Internally the team has a tool they use that is pretty good. Sadly, not public.
Overall, humans are still better at parsing these patterns. Though I recognize that if you have 10,000 rows, that is beyond human. Then I use the link above to find something to reduce things enough to have a human take over. :)
Avinash.
Hey Avinash,
I agree with you that the best surveys are made with only 3 questions, and the ones you have mentioned here in your blog would help a website yield tons of data.
Great post Avinash
Just an idea…
Why not just have an intelligent survey that asks questions depending on the last answered one?
So the first question could be about how your customer feels visiting your website, and after that 2 or 3 other questions could "pop".
This way you will add value to the survey and be able to segment super fast.
Daniel: It is a good idea.
Many nice survey providers allow the if-then-else functionality. That way you can have more questions in the hopper, but only serve a certain number to the visitor.
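Conceptually, the if-then-else is just a mapping from the answer you got to the next question you serve. A minimal sketch, with made-up question text:

```typescript
// A tiny branching-survey structure: each question maps specific
// answers to a follow-up question id. Question text here is made up.
interface Question {
  id: string;
  text: string;
  // answer -> id of the next question; no entry ends the survey
  next: Record<string, string | undefined>;
}

const questions: Record<string, Question> = {
  q1: {
    id: "q1",
    text: "Were you able to complete your task today?",
    next: { yes: "q2", no: "q3" },
  },
  q2: { id: "q2", text: "How easy was it?", next: {} },
  q3: { id: "q3", text: "What stopped you?", next: {} },
};

function nextQuestion(current: Question, answer: string): Question | undefined {
  const nextId = current.next[answer];
  return nextId ? questions[nextId] : undefined;
}

// Example: a "no" to question 1 branches straight into the "why not?" probe.
console.log(nextQuestion(questions.q1, "no")?.text); // "What stopped you?"
```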
-Avinash.
Hi,
Google Surveys are not free anymore, or I was too dumb to find the free version. You may want to correct that in your post.
Anyway, thanks for being awesome and sharing your knowledge!