



  • Test plans and scripts

  • Research summaries and recommendations

  • Feedback overview

  • Customer satisfaction scores analysis

  • Infographics

  • Prototypes

  • Usability test reports

  • Usage analytics reports



  • Ecamm

  • Google form

  • Hotjar

  • InVision

  • Maxymiser

  • Skype

  • UsabilityHub

  • UXPin

  • WhatUsersDo


The goals of the redesign were to build a responsive, faster website that improved the overall reading experience; to sample the breadth of The Economist’s content to persuade people to subscribe; to improve the customer journey of visitors coming to the site to buy or renew a subscription; and to continue to sell advertising space.

Prior to the project, there was only a basic understanding of The Economist’s readers and the way they consumed content through the website, the apps and in print. Most of the previous research had focused on the marketing and advertising realm, so our understanding of how readers actually used the website was minimal.

Due to both structural and strategic changes in the organization, the delivery roadmap for this project was highly compressed, and the time allocated for generative research was very limited.



The overall objectives of my research and testing strategy were to improve our understanding of how readers used the website, to uncover pain points and bugs in the new design, and to discover potential product innovation and commercial opportunities.

I needed to streamline the documentation of results and compress the timeframe for research and testing, making sure that in every sprint research helped our teams learn quickly and iterate rapidly during the redesign.

The main long-term objective was to create a system for understanding trends in our readers’ satisfaction levels over time, build a readers’ panel, and set the basis for future iterative development.



As a starting point for my research process and to help to shape the focus of my primary research, I collected and analyzed all relevant UX research done for previous projects, marketing research and available analytics.


I conducted interviews with senior stakeholders from all relevant departments to build consensus and make sure we were working toward a common goal. Stakeholders were fully involved in my research project from the outset and were provided with regular updates as we progressed.


As part of the early discovery process, my team also ran several ideation workshops with colleagues in the editorial and commercial teams involving Edenspiekermann, a design agency specialized in digital news.

User interviews report


I identified key areas of interest to discuss with the participants and formulated questions to be used as conversation starters. The areas of focus included: browsing, searching and navigation, usage across mobile, desktop, apps and print, commenting, social sharing, importance of audio, video, images and infographics, perception of advertising and pay barrier, ‘My account’ area usage, ideas and suggestions. 

Through a short survey published on blog pages, I recruited and selected 15 readers from different geographic areas who had different types of subscriptions to The Economist. The readers each participated in a thirty-minute interview that I conducted over the phone using Skype and recorded using Ecamm call recorder.

I then created a detailed research summary that highlighted patterns and trends I had identified across the research. These findings were discussed with the cross-functional redesign team, and then fed directly into the product development process, where they were reconciled with internal priorities.



Both the Product Team and I strongly recommended an incremental release approach for the launch of the new website. This approach would mitigate the risks associated with what I believed was a scarcity of research and testing activity relative to the scope of the changes, and would give us the opportunity to iterate and improve the product before final release.

The website was released in 2016, shown at first to just 5 percent of visitors, with exposure then increased incrementally over nine months to 100 percent.
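The actual rollout was handled by the Development Team’s own infrastructure, but a percentage-based exposure like the one described above is commonly implemented with deterministic visitor bucketing. The sketch below is illustrative only; the function name and hashing scheme are assumptions, not the team’s implementation:

```python
import hashlib

def in_rollout(visitor_id: str, exposure_pct: float) -> bool:
    """Deterministically assign a visitor to the new site, given the
    current exposure percentage (e.g. 5.0 at launch, 100.0 at full release)."""
    # Hash the visitor ID to a stable bucket in [0, 100).
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000 / 100.0
    return bucket < exposure_pct
```

Because each visitor’s bucket is stable, anyone included at 5 percent remains included as the percentage ramps up, so returning visitors do not flip between the old and new site.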

With help from the Product and Development Teams, I established a cost-effective, efficient and valuable system to collect, analyze, manage and present readers’ feedback.


The UX team collected, categorized and analyzed approximately 20,000 comments from readers during the nine month implementation period.


  • A Google form to collect readers’ feedback, NPS, and Customer Satisfaction scores. The form was customized to also capture a set of other valuable data (e.g., user state, subscription status, user device, URL).

  • An Excel spreadsheet in which, for each Agile Sprint, the feedback was categorized according to a few main categories (e.g., navigation, search, log in, performance).

The spreadsheet also calculated positive and negative counts and their variation for each category, as well as variations in the scores, and produced charts and diagrams showing the overall performance of the website and customer satisfaction over time.
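To illustrate the kind of calculations the spreadsheet performed, a minimal sketch of the standard NPS formula and the per-category positive/negative counts might look like this (the function names and data shapes are assumptions for the example, not the actual spreadsheet logic):

```python
from collections import Counter

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def category_sentiment(feedback):
    """Count positive/negative feedback per category.
    `feedback` is a list of (category, sentiment) pairs,
    e.g. [("navigation", "positive"), ("search", "negative")]."""
    counts = Counter(feedback)
    categories = {cat for cat, _ in feedback}
    return {cat: {"positive": counts[(cat, "positive")],
                  "negative": counts[(cat, "negative")]}
            for cat in categories}
```

Comparing these per-sprint outputs against the previous sprint gives the variation figures that fed the charts.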


The graphic representations of the findings were very helpful in building stakeholder understanding and support.


Readers who left feedback through the form had the option to give consent to be contacted. Those who were willing to be contacted, besides receiving a response from the relevant departments, were selectively invited to participate in research and testing activities and become part of our readers’ panel.


Reader collaboration was particularly important to the redesign, and readers’ feedback shaped the new website.


For example, we found that readers wanted more content on the homepage and weren’t happy with the content density of the page we had originally designed. As a result, we increased the number of links visible to readers, and the number of people clicking through from the homepage to another article increased by 8 percent.


Feedback, combined with metrics, was also essential for discovering issues associated with new features, as well as opportunities for improvement once the website was well established.


One challenge I faced was making the variety of feedback relevant to each team. To address this issue I created, for each Agile Sprint:

  • A two-page ‘Feedback overview’ report presenting the scores, the most relevant feedback, a link to all the feedback received during the sprint, and the key findings and recommendations

  • A separate spreadsheet for the Customer Service Team populated with the customer service issues that readers were reporting using the feedback form

  • A one-page physical copy of the most relevant feedback that I displayed across the office, kitchen area and conference rooms in order to show progress and maintain stakeholder engagement.

[Images: variations in scores; user feedback; most relevant user feedback; WhatUsersDo interface]


I created the testing plans, defined the scenarios, and wrote the scripts for several series of remote tests using the WhatUsersDo remote user-testing tool. The scope was to discover how the new website was perceived by first-time visitors and how well it was fulfilling its business goals, as well as to identify opportunities for improvement.

For each test, 10 participants selected from a panel (not Economist readers) were asked to view and navigate the website on desktop or mobile and think aloud about their impressions. The participants were then asked to visit their favourite news website and describe what they liked or did not like about it and why. They were also asked to perform a few simple tasks, such as sharing an article, renewing a subscription, and giving an Economist gift subscription. Test sessions lasted up to 20 minutes and were video recorded.

I also used remote tests to discover usability problems in the navigation and in new interactive feature prototypes built using InVision and UXPin; the key findings of these tests were fed directly into the iterative product development process.



A core goal of the redesign was to persuade people to subscribe and improve the customer journey of visitors coming to the site to buy or renew a subscription.


I worked closely with the Marketing and Circulation team in the creation of hypotheses for the types of content changes that could impact conversion, and in building efficient A/B testing plans. The A/B tests focused on the effectiveness of pay and registration barriers, internal advertising, and calls to action. These tests were conducted on a regular basis using Maxymiser.
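Maxymiser handled the statistics internally, but the underlying comparison in a simple A/B test can be sketched as a textbook two-proportion z-test. The function below is illustrative of that general method, not the team’s exact procedure; its name and thresholds are assumptions:

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (uplift, p_value) for variant B vs A.
    conv_* are conversion counts, n_* are visitor counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value
```

A variant would typically only be adopted when the p-value falls below a pre-agreed threshold such as 0.05.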

To refine messaging and design and optimise conversion rates, I also ran several sets of ‘first click’, ‘preference’ and ‘5 second’ tests using UsabilityHub.



As a complement to other types of research, to analyze browsing and clicking behavior, particularly on the homepage, and to compare the findings with available analytics data, I helped my team set up heatmaps and visitor recordings using Hotjar.


I created a template for the findings to be shared and discussed with the broader Product Team, and Hotjar reports became part of the ongoing research activity for the website.

Hotjar interface