Monday, September 12, 2011

A Web Analytics & PDCA case: improving marketing websites in a pan-EU context

In the last post, I presented the Plan-Do-Check-Act (PDCA) approach and how it can be applied to Web Analytics in order to create a Web analytics culture – not just a Web measurement culture. In this post I would like to illustrate this approach with a practical example from my personal experience.

As I learned PDCA when I was working at Toyota Motor Europe (“hi ex-colleagues!” – in case your social media monitoring picked up this post :-)), I will start with an example from my previous job, where PDCA was applied in a rather large web analytics-related project involving many stakeholders and long iteration cycles.

Improving overall online marketing performance in a pan-EU context
First, a bit of background and context. As in many global organizations, Toyota's European sites run on a central platform, using common tools and a content architecture provided by the European headquarters, but the sites and marketing activities are managed at a local level by the respective national companies and their local Internet marketing managers. Regarding Web analytics, the headquarters offers not only the tool and measurements but also global reporting, analysis and support. That was part of my previous job.


The project: the European internet marketing team wanted to improve the overall efficiency and quality of the online marketing sites by increasing the use and adoption of online data (i.e. instilling a Web analytics culture :-)). It couldn't have been more challenging!

PLAN step
The plan phase started with the definition of a set of online key performance indicators (KPIs) directly based on the common and well-known (online) objectives of the marketing sites: supporting customers during the different steps of the purchase process. So for each stage of the funnel, we identified a set of corresponding online metrics and defined how they would be measured and from which data sources.

The proposed KPIs were presented to the stakeholders (i.e. the marketing managers of more than 25 markets) and discussed until a consensus was reached. This took a few months, as you can imagine, but having stakeholder buy-in was crucial – otherwise there was no way they would support and use the provided indicators.

We also agreed on a communication process and format: a monthly dashboard to track progress and checkpoints (the analysis part) every six months. Markets were organized into groups with similar characteristics (market size and share, internet penetration...) so they could be benchmarked against each other.
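To make this a bit more concrete, here is a minimal sketch of how such funnel-based KPIs and market groupings could be represented. All the stage names, metrics and groupings below are hypothetical illustrations of the approach, not the actual indicators or groups used in the project.

```python
# Hypothetical sketch: funnel stages mapped to online metrics, plus market
# groups used for benchmarking. Names and groupings are illustrative only.

FUNNEL_KPIS = {
    "awareness":     ["visits", "new_visitor_share"],
    "consideration": ["model_page_views", "brochure_downloads"],
    "intent":        ["car_configurations_completed", "test_drive_requests"],
    "action":        ["dealer_locator_uses", "request_a_quote_submissions"],
}

MARKET_GROUPS = {
    "large_markets": ["DE", "FR", "UK"],
    "mid_markets":   ["BE", "NL", "AT"],
    "small_markets": ["DK", "FI", "PT"],
}
```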

DO step
In the initial iteration, the required tracking was implemented, data was collected and gorgeous dashboards were created and delivered at the agreed frequency.

In the following iterations, the “DO” phase also included the implementation of the identified key actions (see further below).

CHECK step
That’s where things got serious. That’s where reporting turned into analysis. Every six months, the results for the whole period were analyzed and presented to the stakeholders – by group.

The KPIs helped identify areas that needed improvement for each market, but also the “best in class” within the group. The next step was to understand why these good “pupils” were doing so well and see if there was anything the group could learn from them. This usually led to opportunities that could be replicated and bring benefits to others (it's called “Yokoten” in Japanese :-)).

Reviewing the results sometimes also pinpointed actions that did not work and brought poor results – helping others avoid making the same mistakes.
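As a rough illustration of this benchmarking logic, the sketch below compares markets within one group on a single KPI and flags the best performer against the group average. The market codes, KPI name and values are made up for the example.

```python
# Hypothetical sketch of the CHECK-step benchmarking: find the best-in-class
# market for a given KPI within a group and compare it to the group average.

from statistics import mean

# Hypothetical six-month results for one market group and one KPI
results = {
    "DE": {"test_drive_requests_per_1000_visits": 4.2},
    "FR": {"test_drive_requests_per_1000_visits": 6.1},
    "UK": {"test_drive_requests_per_1000_visits": 3.5},
}

def benchmark(group_results, kpi):
    """Return the best-in-class market, its value and the group average for a KPI."""
    values = {market: kpis[kpi] for market, kpis in group_results.items()}
    best_market = max(values, key=values.get)
    return best_market, values[best_market], mean(values.values())

best, best_value, group_avg = benchmark(results, "test_drive_requests_per_1000_visits")
print(f"Best in class: {best} ({best_value}) vs. group average {group_avg:.1f}")
```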

ACT step
In the end, we came up with key areas to be improved and a list of recommendations, such as marketing activities that could be run, content and tools to be added to the site, and more.

A consensus was reached with each market on the areas to improve and the priorities – usually focusing on only one or two areas per cycle. Targets to be reached were also defined (very important!).

At this point, it was up to the market to create its own action plan and to communicate it back so progress could be monitored.

In order to share best practices, findings were communicated and presented on a regular basis to all markets – even non-participating ones. From there, a new PDCA cycle started, each market having a six-month period to implement as many actions as possible before the next checkpoint.

Did it work?
This process brought very positive results in the sense that it did indeed increase the adoption and usage of online data. The first cycle started with a limited number of markets, but as the outcomes were presented in general meetings, more and more markets joined the process over the years, until in the end the majority of them were taking part.

But it also contributed to increasing the general efficiency of the marketing sites across Europe, as progress was measured in many cases – even if the targets were not always reached. It also prompted many markets to add valuable online services and content, increasing the overall quality of their online channel.


An important learning point was also that – even once you have reached a good level of adoption – you have to keep investing time to maintain the process, to stay behind people and motivate them (sometimes harass them). Otherwise, the results of your efforts may quickly fade away.

I hope you have found this example interesting and that I managed to illustrate how PDCA can be applied to Web analytics in a large-scale context – and how it can drive measurements, reporting and, more importantly, action-driven analytics and results!

In a future post, I will cover another example in a much smaller context with short iteration cycles. Feel free to share your comments or doubts (if any), or ask any questions – I will be delighted to answer.
