The title of my presentation at the Washington DC Emetrics summit was: Creating a Data Driven Web Decision Making Culture – Lessons, Tips, Insights from a Practitioner.
My hope was to share tips and insights that might help companies move from just having lots and lots of data to creating cultures where decisions are made not on gut-feel, or the proverbial seat of the pants, but rather based on data.
In this post I hope to share the essence of some of the main ideas communicated in the speech. The format is: words from the slide followed by a short narrative on the core message of the slide. Hope you find it useful.
Quick Summary:
# 7 Go for the bottom-line (outcomes)
# 6 Reporting is not Analysis
# 5 Depersonalize decision making
# 4 Proactive insights rather than reactive
# 3 Empower your analysts
# 2 Solve for the Trinity
# 1: Got Process?
# 0 Ownership of web analytics: Business
Seven Steps to Creating a Data Driven Decision Making Culture……..
Slide 1: Decision Making Landscape
State of the Union….
‡ Time to implementation: five minutes
‡ Tools are just that, sadly
‡ Humans love gut (literally in some cases :))
‡ Math is hard
Core Message: The biggest challenge in our current environment is that it is trivial to implement a tool; it takes five minutes. But tools are limiting and can only give us data. What compounds the challenge is that we all have a deep tendency to make decisions based on who we are, shaped by our life experiences.
Based on my humble experience of the last few years here are seven common sense recommendations for creating a data driven company culture……
# 7 Go for the bottom-line (outcomes)
‡ Never start with clickstream, it becomes “old” quickly
‡ People care about their paychecks
‡ Execution strategy:
~ Identify Senior Management hot buttons
~ Exhibit daily that you can
• increase revenue
• trim costs
• improve customer satisfaction
Core Message: The most common mistake in web analytics is to slap a clickstream tool (Omniture, WebTrends, HBX / WebSideStory, CoreMetrics etc.) on the website and start sending out reports chock full of clickstream KPIs. Great for a couple of months, and then you lose the audience. Sit down with your core audience and figure out what motivates them and how their personal salary / bonus is determined. Start with measuring these Outcomes metrics (revenue, leads, profit margins, improved product mix, number of new customers etc.).
Once your audience figures out that you exist to make them successful (and not to spam them with reports) they will be your eternal friends, and over time you can slowly start to evolve them from Outcomes to some pretty complex clickstream analysis and KPIs.
# 6 Reporting is not Analysis
‡ 99% of the web analytics challenge:
~ Data : petabytes
~ Reports : terabytes
~ Excel : gigabytes
~ PowerPoint : megabytes
~ Insights : bytes
~ One business decision based on actual data: Priceless
‡ Reporting = providing data (time consuming, all subsuming)
‡ Analysis = providing insights (time consuming, all subsuming)
~ Your Choice?
‡ Reporting = the art of finding 3 errors in a thousand rows
‡ Analysis = the art of knowing 3 errors in a thousand are irrelevant
~ Your Choice?
Bonus: Here is a blog post that outlines in detail the difference between reporting and analysis, and shares ideas you can use every day: Rebel! Refuse Report Requests. Only Answer Business Questions, FTW.
Core Message: There is a lot of confusion between what is reporting and what is analysis. Analysis in our world is hard to do: data, data everywhere, and nary an insight anywhere. Reporting is going into your favorite tool and creating a bazillion reports in the hope that one of them will spark action for you or your users. That is rarely the case.
An additional challenge is that both reporting and analysis can take over your life; you will have to make an explicit choice as to what you want to spend time on. Remember that if, at the end of x hours of work, your table / graph / report is not screaming out the action you need to take, then you are doing reporting and not analysis.
# 5 Depersonalize decision making
‡ “HiPPOs” rule the business world
~ Highest Paid Person's Opinion
‡ It is never about you, it can’t be about you
~ Benchmarking is awesome
~ Leverage competitive analysis
~ Experimentation and testing rocks
‡ Execution strategy:
~ Transparency, standardization, looking outside in
~ Be a slave to customer centricity
• It’s about your customers (internal & external)
Core Message: I can’t say it any better: HiPPOs rule the world. They overrule your data, they impose their opinions on you and your company’s customers, they think they know best (sometimes they do), and their mere presence in a meeting prevents ideas from coming up. The solution to this problem is to depersonalize decision making; simply don’t make it about you or what you think. Go outside, get context from other places. Include external or internal benchmarks in your analysis. Get competitive data (we are at x% of zz metric and our competition is at x+9% of zz metric).
Be incessantly focused on your company’s customers and on dragging their voice to the table (for example via experimentation and testing, or via open-ended survey questions). Very few people, HiPPOs included, can argue with a customer’s voice; the customer, after all, is the queen / king! : )
# 4 Proactive insights rather than reactive
‡ “Traditional Web Analytics” = Going “forward” while looking out of the rear view mirror and driving in reverse!
‡ Get ahead of the train, earn a seat at the strategy table
‡ Execution strategy:
~ Don’t wait for questions to be asked
~ Attend “operational” meetings and sessions
~ Drag in best practices from outside
~ You can no longer be just a “web analyst”; now it’s healthy doses of “web smart guy/gal”
~ 20% of your time should be providing analysis no one asked for and only you can perform
Core Message: Web Analytics is “rear view mirror” analysis: by the time you get the data, even in real time, it is already old. This complicates things quite a bit. In order to get ahead, don’t wait until someone stops by asking for a report. Get ahead of the game. Attend strategy and operational meetings. Be aware of upcoming changes to the site, your campaigns, or your acquisition options. Before you are asked, have a plan to analyze the impact and proactively present results. You will win kudos, and you will, because of who you are, have provided better analysis than what might have been asked for (or worse, they might just keep doing stuff and never know if it works).
That last bullet above is very important: If you are an Analyst, and not a report writer, 20% of your time should be devoted to poring over data and doing analysis that no one asked for but only you can do, because you are the only smart one in the family.
# 3 Empower your analysts
‡ Two deadly problems: Tools are restricting and corporations expect predictability
‡ Senior Analyst / Manager Rule: 80% analysis – 20% reporting
‡ Create an environment that encourages risk taking
‡ Execution strategy:
~ If you need reporting hire an intern
~ Hold Analysts accountable for insights, then set them free
~ Critical thinking should not be under-rated
Core Message: Almost every company hires for the position of an Analyst, often Senior Analyst, and then quickly proceeds to convert them into report writers. “Here is our Omniture / WebTrends / HBX tool, here is a list of all our internal data consumers, and here are all the reports that they need.” This is a perfect job for a summer intern (they come with the additional benefit of wanting to work really, really hard for no pay). The job of a management team that wants to see a data driven culture is to first empower their analysts. This means giving them the strategic objectives of the website and then getting out of the way. Make sure that the workload of the analyst is such that they can spend 80% of their time doing analysis. Hire critical thinkers.
Data driven cultures rarely exist on Reporting. They thrive and prosper on analysis, by one person or by every person in the organization.
# 2 Solve for the Trinity
‡ ClickStream is 33% of the input, on its best day
‡ ClickStream = only the What
‡ Research = adds the Why
‡ Outcomes = adds the How Much (as in: are you kidding, we only made this much? :))
‡ Execution strategy:
~ If you only have clickstream, get the others
~ Integrate clickstream, outcomes, surveys, usability, open text voc
~ Start with How Much, move to What and grow into Why
Core Message: I am sure you are all bored to death hearing me talk about the Trinity strategy (click here if you are not bored). The lesson here is simple: doing only clickstream analysis does not create a data driven culture, because clickstream data can’t consistently provide deeply impactful analysis. Normal business people have a hard time digesting the amazing limits to which we stretch clickstream data. Bring in other sources of data that make for richer, fuller-picture analysis. This will make it much easier to connect with your users and with the things that they find valuable and can understand.
Secret sauce: Start with the How Much, evolve to the What, then strive for the Why (or Why Not, if that is where you find yourself : ).
# 1: Got Process?
‡ Report publishing / emailing schedule is not a process
‡ Web decision making can’t be ad-hoc or just post-facto
‡ Decision making is a journey, not a destination
‡ Execution strategy:
~ Steal / be inspired by Process Excellence, adapt as necessary
~ Identify core web processes, push to identify operations, define success metrics, put decision making process in place
~ Get stakeholders to have skin in the game
Core Message: This is perhaps the single biggest difference between cultures that achieve the mythical status of being data driven and those that languish. Process helps create frameworks that people can understand, follow and, most importantly, repeat. Process Excellence (Six Sigma) can also help guide you, ensure that you are actually focusing on the Critical Few metrics, and help establish goals and control limits for your metrics so that it becomes that much easier to stay focused and execute successfully.
Processes don’t have to be complex, scary things. The picture shared was that of a simple PowerPoint slide that used a very visual flow to illustrate exactly what the process for executing an A/B or multivariate test was, end to end. It showed who is responsible for each step and what deliverables are expected. Very easy to do. But now not just you but everyone knows what to do. At the end of the day it is process that creates culture. Do you have structured processes in your company?
One critical bonus recommendation……
# 0 Ownership of web analytics: Business
‡ Think, imagine, move at the pace of business
‡ Ownership close to outcomes, proactive and analytical needs
‡ Successful Web Analytics usually, not always, outside IT
‡ Execution strategy:
~ Identify the website / web strategy owner for your company
~ Consider moving your Analytics function (all of it) over to them
~ Insist that the Analytics function own and drive holistic reporting, analysis and testing strategies
~ Create and measure success metrics for your Analytics team
Core Message: I get asked this question all the time: who should own web analytics? Most companies don’t have a single team that owns web analytics end to end. There is a team in IT responsible for the tag, another team in the PMO responsible for gathering requirements, yet another team, usually fractured all over or in IT, responsible for creating reports, and someone else responsible for looking at the data and doing something, or usually nothing.
Web analytics should be owned by a business function, optimally the one that owns the web strategy (not the web site, the web strategy). That will align measurement of the success of the strategy very closely with ownership of the strategy. This will also ensure that the team has the air cover it needs and that the business has skin in the game. And usually, though not always, business teams have a different mindset than IT and can think smart and move fast (this is not to say IT can’t; I have spent four years in IT myself : )).
In summary: Data Driven Organizations……..
* Focus on Customer Centric Outcomes
* Reward analysis and not number of emailed reports
* While measuring success against benchmarks
* Which is achieved by empowering your analysts
* Who solve for the Trinity, not just clickstream
* Using a well defined process
* That is owned and driven by the business function
How is your company doing? Do you have a culture that fosters some or all of the above? Have you observed strategies that work for you? Have you tried some or all of the above and it still did not lead to success? Please share your tips, feedback, success stories via comments.
[Like this post? For more posts like this please click here.]
Hi Avinash,
It was an outstanding presentation! You're on my top 3 of eMetrics Washington 2006 ;-).
I really like the way you think, and you make lots of sense, as Aurélie said. I also liked a lot your point about 20% of your time being spent on analysis no one requested.
I've learned with time that this is crucial. When we were starting WA at OX2 I always argued with Aurélie because I thought she was very often going too far beyond what the client had requested. I see now that this is our strength. We allow ourselves to go a little bit further, bringing (or at least trying to bring) insights to our customers.
Getting back to your call for feedback, I just might have a comment. It's regarding the boss's pay check. When I see how Robbin interpreted your point, in her post called The Web: Is it Really about Money?, I guess that you should maybe specify that this advice applies when you're an analyst within a company. I don't think it will always work that well when you're an external consultant. So maybe just clarifying that issue is my only remark.
Great presentation again, and you should therefore receive an A+ ;-)
Kind Regards,
René
René: Thanks so much for the feedback, it is exactly the kind of critique I was looking for. At the summit Robbin had also been kind enough to share her feedback on the money element. We had a great conversation and I have some good food for thought on how I can better frame this point in future presentations.
Thanks for taking the time to provide your feedback.
-Avinash.
Your post is excellent as usual but I am a bit confused when you say:
"Time to implementation: five minutes"
Well, we try to convince our clients to spend time (more than five minutes) to plan their tagging.
We advise them to do a "plan de marquage", which can be translated as a tagging map.
Which label to give to a page, in which heading to place a page (not necessarily the same as on the website), which parameters to insert in their tag code, etc…
This work forces people to think about what they expect from their tool.
It will dramatically influence the "quality of data", of reports and of analysis. I think that it can't be neglected.
Benoit
Hi Avinash,
Nice summary. I am now even more bitter that I didn't see your presentation of this (I don't remember what session I was in). I especially appreciate your thoughts on the difference between reporting and analysis, and the importance of trusting your analysts enough to find things that truly scare you about your website. I find that a lot of people think that an analyst's job is to find data that supports an action, rather than the other way around. It's a big cart-before-horse problem that your post goes a long way towards addressing.
-Jason
Great post Avinash. Wish I could have made it out to the eMetrics Summit to see it live. Would have been nice to grab the slide deck as well as your post from the site.
I especially like the HiPPO portion. Interesting that others may have mistaken the point as something like "your opinion doesn't matter, you're working for the man". Personally, I've found significant success with highlighting how we're doing compared to others and bringing out customer comments; it now makes up a significant portion of my reporting and has even forced some deeper analysis. It has stopped my HiPPO in his tracks and made him sit up and listen.
– Scott
Benoit: Very good point. Tagging a site can be quite a complicated experience depending on which web analytics vendor is being used and how much data the website is willing to cough up.
There are vendors that will allow you to have one standard tag on the entire website and they will capture 90% of the data you need instantly. So that implementation can be five minutes (throw the javascript tag in a global website element like a footer, hit save and go get a bottle of wine).
If your vendor can't support a standard tag on the site and needs to have multiple tags on the site or you need to pass a lot of variables via javascript to your web analytics tool then here is the recommendation:
1) Implement a "standard" simple tag on the site as soon as you can.
2) Learn from that implementation what your company really needs and where the gaps are in data capture / web analytics tool.
3) Strategically update the tags on pages as you have to. Get smarter over time.
4) Buy REL Software's Web Link Validator to check that the right pages have the right tags (or if they are missing tags all together). Costs between $95 to $495 and it does a lot more than check for missing tags.
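To make step 1 concrete, here is a minimal, vendor-neutral sketch of what a "standard" tag boils down to: one snippet in a global footer that reports the same basics for every page. The endpoint and parameter names here are invented for illustration; every real vendor's tag differs.

```javascript
// Hypothetical sketch of a "standard" global tag: collect only what every
// page can report without custom variables, and fire it as an image beacon.
function buildBeaconUrl(endpoint, page) {
  const params = new URLSearchParams({
    u: page.url,              // page URL
    r: page.referrer || '',   // referrer, empty if direct
    t: page.title             // page title
  });
  return endpoint + '?' + params.toString();
}

// In a browser footer this would run on every page load, e.g.:
//   new Image().src = buildBeaconUrl('https://stats.example.com/collect', {
//     url: location.href, referrer: document.referrer, title: document.title
//   });

// Demo with fixed values so the sketch is runnable anywhere:
const url = buildBeaconUrl('https://stats.example.com/collect', {
  url: 'https://example.com/pricing',
  referrer: 'https://www.google.com/',
  title: 'Pricing'
});
console.log(url);
```

Because the same snippet ships in a shared site element, every new page is measured automatically, which is exactly what lets you layer smarter tagging on iteratively later.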
I have to admit I am a bit biased against "let's make a master plan before we go out there and implement perfectly the first time" (and this is not what you are saying, of course; I have heard it said by other folks). Since business users ask for what they want and not what they need (until after you give them what they wanted), I think an iterative approach works best.
This is not very different from the overall mindset recommended in the "how to select a web analytics vendor" post.
I hope this helps. Thanks so much for your comment, I appreciate your feedback very much.
-Avinash.
Yes, you're right, Avinash.
An iterative approach can produce better results.
Clients have difficulty seeing what's good for them the first time.
Benoit.
Avinash,
Very good summary of what's important in developing an analytics program. I think that the core of the issue is that reporting is easy and requires very little skill and responsibility for the outcome. Analysis requires you to put your reputation on the line. However, if the data is correct and the evidence is clear, it should be a confident action and recommendation.
Great stuff, keep it coming! I wish I could have attended and seen you deliver this in person, but I enjoyed reading the summary and how you presented it here.
My takeaways are pretty clear:
If you're an analyst, look for a boss who wants and knows how to use an analyst and will let you do your thing.
If you're a web site or marketing program "owner" with bottom-line accountability, look for an analyst who uses data to generate important questions and strategic ideas. Hire someone who has demonstrated that he or she knows how to help you use their work to grow the business and advance your career. Or save your job.
Nice presentation.
Avinash, what have you done in terms of data presentation in the past to best satisfy the various stakeholders' needs? Obviously a CEO wants to see different metrics than a Marketing Manager, but what do you think is the most effective way to present the data to both of them?
Manoj: You have asked a very complex question, hard to answer with a pithy reply. I'll try.
Just as you segment data infinitely to find insights, so we have to segment and present exactly the targeted data to each stakeholder.
My usual recommendation for Senior Management (say VP and higher) is to present only the core metrics that measure what the company strategy is solving for. The dashboard should fit on one page, mostly graphs, and not show more than seven metrics. Each should be measured against a goal or benchmark (remember context is king, and you also don't want to make them think).
For others, my recommendation is to empower them to find the data they need and help themselves. If you do a good enough job with the Sr. Management dashboard, you can bet they will put pressure on the organization to measure metrics / KPIs that have "line of sight" (relevance) to what is on their dashboard.
Hope this helps (I realize it is probably not as expansive as what you might be looking for, in my speeches I have shown real life "pictures" but they can't go on the blog for obvious reasons).
-Avinash.
Thanks Avinash, I was trying to get a feel for how other analysts have handled this and your answer has helped. (I realize it's tough to answer in a comments section)
Hi Avinash,
Thanks for your very interesting post! It's exactly what I'm thinking about Web Analytics & Reporting!
The only open point which annoys me every day is the question about benchmarks. I'm working for an IT company in Germany and we have a lot of country websites. We have already defined several KPIs, but the first question I always hear is: "Is it a positive or negative result?"
How can I answer this question?
What comparisons of which data make sense? I have already realized that I can't even compare the results between our different country websites, because there are so many influencing factors which make a comparison almost useless (e.g. different country means different markets, different prospects, different awareness levels of our brand etc.).
I'm also aware that a comparison between our websites and the websites of competitors would raise the same problems/questions.
How are you dealing with the question about benchmarks? How do you evaluate your KPIs? Is it reasonable to define goals for every year, e.g. "in 2007 we'd like to have a conversion rate of 20.6%"?
Keep up your good work for this blog – very good and informative!!
Mafalda: In my speech this was covered under step #5 (Depersonalize decision making). The core thought was that if you want to make headway with numbers and trends then it can't be about you/me; we have to bring in benchmarking (internal and external) or bring in customer context. This way it is not about us, it is what "someone else" is saying.
As regards benchmarking, I had shared two suggestions:
1) "Internal" Benchmarking: Simply use the data you have to create "benchmarks". So, for example, never create a 7 day or 12 month trend; always do 8 days or 13 months. The cool thing is you just gave your users a "benchmark" of how you were performing in an earlier comparable time period. This is one small example.
The other way to do internal benchmarking is what you suggest. Look at trends and patterns over x amount of time or for y customer segments and create your own benchmark. For example: for the last year conversion has been 2.4%, and now we have three new VPs for Conversion and we are doing SEM, so we should now have a Conversion of 3.4%. Now 3.4% might just be a way to start the conversation.
If you end up doing better than 3.4% then you dig deeper, and if you end up lower then you dig deeper. Either way the analysis has started.
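The "8 days instead of 7" trick amounts to a tiny calculation: the extra day is the same weekday one week earlier, so the report carries its own benchmark. The visit numbers below are made up purely for illustration:

```javascript
// Pull one extra day so the trend carries its own internal benchmark:
// day 0 is the same weekday one week before day 7 (the most recent day).
const visitsLast8Days = [1210, 980, 1005, 1120, 1340, 1290, 1075, 990];

const benchmark = visitsLast8Days[0];                       // same weekday, last week
const today = visitsLast8Days[visitsLast8Days.length - 1];  // most recent day
const changePct = ((today - benchmark) / benchmark) * 100;

console.log(`vs. same weekday last week: ${changePct.toFixed(1)}%`);
```

A drop like this against a comparable day is a conversation starter, which is the whole point of the internal benchmark.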
2) External Benchmarking: Get benchmarks from outside. For example, we use the ACSI (www.theacsi.org) for measuring Customer Satisfaction. We don't have to say your site is bad; we have an external benchmark that says that. Another example: the last shop.org benchmark for conversion was 2.2% for online retailers, so that is something we can use. Finally, a source such as HitWise is a great resource for benchmarking (you can benchmark how much search traffic you get vs. your competitors, or what your share of keywords is vs. others, etc.).
As regards different countries, I am sure there is something you can start with. I would look for that, even if you can find just one metric. Giving all of them one thing that they can all be "benchmarked" against can be a great way to motivate the right behavior.
Here's a suggestion: Run a simple pop-up survey with two questions on all your websites….
1) Why are you visiting our website today? (Open ended answer.)
2) Were you able to complete the task that you came to the site for? (Answers: Yes or No.)
Now benchmark all of them against this. You have a winner on your hands. :)
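Question 2 of that survey yields a single number every site can be compared on, regardless of market or brand awareness. A minimal sketch of the computation (site names and responses are invented for illustration):

```javascript
// Hypothetical sketch: compute a task-completion rate per country site from
// the yes/no answers to "Were you able to complete your task?".
function taskCompletionRate(responses) {
  if (responses.length === 0) return null; // no responses, no benchmark
  const completed = responses.filter(r => r.completedTask).length;
  return completed / responses.length;
}

const sites = {
  'example.de': [
    { completedTask: true }, { completedTask: true },
    { completedTask: false }, { completedTask: true }
  ],
  'example.fr': [
    { completedTask: true }, { completedTask: false }
  ]
};

// One comparable number per site, whatever its market or brand awareness.
for (const [site, responses] of Object.entries(sites)) {
  console.log(site, (taskCompletionRate(responses) * 100).toFixed(0) + '%');
}
```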
Hope this helps.
Avinash.
Great comments
I was in charge of INFO management for a large FMCG and they placed limited importance on it. It was like banging your head against a brick wall. IT didn't get it either – too focused on transaction capture and on monitoring and control reports for finance.
How can I keep in touch?
Avinash, enjoyable reading. Especially liked the point on reporting versus analysis. To add to above, would it not be valuable to i) think of every business decision as a choice, ii) phrase that choice as a specific question, iii) spend enough time on the wording, bent of the question till one is satisfied that it is infact the most relevant question, and then iv) look for data that may help answer that question?
To illustrate, if two people walk out of a meeting looking for solutions to the question – "What should we do to grow the company?", one could interpret that as "..grow net profit" and might take a look at cost data. The other might interpret the same as "..grow revenue" and might ignore cost data completely. What if both had the second interpretation?
Another example is the question "What discount in price should I give customer X?" (in response to a customer asking for a 5% discount). Lets assume facts show you this is a highly profitable customer you want to keep. Maybe the customer is really looking for some kind of financial compromise, and perhaps you can keep him happy by adding 30 days to his payment terms (and be better off). If you had asked the question "What additional incentive can I give my customer to keep his business?", you may have been looking for a different set of data.
My belief is that such situations are commonplace in organizations. We tend to hurry through the questions, and spend all our time looking for answers. The big question may well be "For which question do I need an answer?".
I especially love #6.
Hi Avinash,
Great post that I definitely find a lot of truth in – and it has undeniably helped exemplify my take on it as well.
…and yes; I know I am way late in commenting. :-)
Cheers
Dennis
Dennis R. Mortensen, COO at IndexTools
My take on: Web Reporting vs. Web Analysis!
Great articles, plus I love the presentations on YouTube. I need some advice…
My HiPPO needs to be convinced that data driven decisions are more powerful than assumption based ones.
I need to supply irrefutable proof. I can give testimonials, invent projected conversion improvements, etc, but they're not strong enough since they apply to others or they are guesses.
Can anybody please share their own experiences or offer some advice?
Thanks very much,
Ernie
I stopped reading mid-way through because I just had to say thanks for the common sense tip that I think will help me finally acclimate to my new company: "HiPPOs rule the business world". I may have to put it up at work to remind me that in order to convince our HiPPOs (and I know I won't always be able to) I'll need depersonalized figures, facts and analysis, gathered from other Int'l HiPPOs, in order to get ideas and recommendations seriously considered in my company.
Also, I'm not a web analyst and yet I find your blog insightful for my purposes, competitive analysis, as well. Thanks!
Hi Avinash!
I'm late to your presentation about Creating a Data Driven Culture, but it is still timely and relevant! I watched the vid on YouTube recently, and I wish I'd known it was outlined here on your blog already. It would have saved me a lot of time, as I'm sure your notes are much better than mine! I was trying to capture both what you said and the thoughts I had about what you said. It was like your talk was about the proverbial 'elephants in the room' that don't always come to the forefront of conversation, and definitely not with the clarity with which you spoke them. Sometimes the corporate dynamic has a way of over-complicating the simple, while over-simplifying the dynamic. Way too much going on in that vid – in a good way! :-)
I found a lot of valuable food for thought, but I especially love that you broached and responded to the question of WHO should own the data in organizations. I like your point that it should be owned by the people who are responsible for (whose necks are on the line for) the Outcomes. Ideally, I see that as a co-ownership among multiple depts (i.e. Marketing, Sales, Production), not just a single dept.
Thanks for the great insights!
For anyone else just finding this, enjoy the video here: http://www.youtube.com/watch?v=OTu02Gab0Qw
Kaitrece
Avinash –
I have nothing to do with web analytics; probably I don't even know what it means. However, I live, breathe and get paid for analysis.
I can drop the word "web" from your blog and it is still right on the money. Every word is so true for just about any business's decision making. Or should I drop the word "business" too? Thanks for sharing your thoughts.
Without the power of data, any decision making will only be derailed by the loudest or funniest guy in the room!
Mine is definitely coming years after this post, but I found this particular read priceless. I train people in advanced analysis using Excel, but most of the time I fail to track my own business metrics. Analysis is good, but what percentage of these analytics are actually used by corporate HiPPOs? I once worked for a 501(c) in Africa, and it seems most of the data I generated remained in the C-level's inbox for months because we believed analysis was the thing.
Nice read. I will bookmark this site.