To enable fast and effective data-led decision-making, the FT has been on a journey to extract better value and competitive advantage from its data.
Last year, a partnership was forged between the Analytics and Internal Products teams to implement a new analytics tool targeted at the product team. The intention was to change the way the two teams worked together.
Having led the initiative from the product side, I'm sharing the journey and the 6 lessons we learned from introducing self-serve analytics at the FT.
We chose to focus on implementing a tool specific to the Product team because they play a key role in delivering the FT’s subscription strategy, creating and developing customer-centric products to grow lifetime value. To do this, they need to be able to make effective and timely decisions using real-time and historical data.
The results
- Prior to the introduction of Amplitude, the time to insight was 3–5 weeks. Product Managers had to wait in the queue for an available analyst to answer their query. This meant they couldn’t always focus on the priority hypothesis as they had to complete other tasks while waiting for the analysis to be done. Now, the time to insight is 15 minutes. They can answer most questions they have by themselves.
- Amplitude is an off-the-shelf tool that we purchased. However, success in driving adoption depended on a host of factors: the content in the tool itself, the user interface, training, onboarding, support and communications. This overall approach led to the highest NPS score within the Internal Product team’s suite of products.
- The way of working has changed. Through observing how users were using the tool in their roles, we uncovered three types of emergent behaviour.
Using Amplitude has changed behaviour and decision-making processes, and the product team can now link qualitative and quantitative methods.
The view from a user's perspective
We saw three common themes emerging from 30 user stories.
Theme 1: Faster experimentation
The team sees results in real time, spotting issues more quickly and deciding sooner whether to stop or move on.
This matters because… The product team can focus on following through on priority hypotheses rather than completing other work while waiting for analysis. By that time, the opportunity could have passed.
Example: A product team ran an A/B test relating to the navigation menu. This was scheduled to run for 3 weeks. However, after 1 week, the team noticed that the test was underperforming significantly. The results proved the team’s hypothesis that users were not aware of the link. The team decided to move to the second phase of the test without having to wait for the two remaining weeks.
Theme 2: Users are empowered to answer their own questions
They can explore data themselves to identify opportunities to invest in or prioritise.
This matters because… Simply exploring a reliable, well-understood data set on one’s own has produced many aha moments, with users saying they feel more grounded in the data and more confident.
Example: A funnel chart created in Amplitude showed that one particular page had the biggest drop-off in the page journey, with half of users leaving before reaching the next section. This suggested the page should be re-evaluated. The team made the changes and ran another A/B test, removing the original page they’d put in. This proved a massive success, significantly increasing the completion rate, and it was something the team could monitor on their own.
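The arithmetic behind a funnel chart like this is simple to reason about. A minimal sketch, using made-up step names and counts rather than real FT data, of computing the per-step drop-off a funnel surfaces:

```python
# Minimal funnel arithmetic. Step names and counts are invented for
# illustration; they are not FT data.
def drop_offs(steps):
    """Per-step drop-off: the fraction of users lost between consecutive steps."""
    return [1 - nxt / cur for (_, cur), (_, nxt) in zip(steps, steps[1:])]

steps = [("landing", 1000), ("details", 800), ("payment", 400), ("confirm", 360)]
for (name, _), rate in zip(steps, drop_offs(steps)):
    print(f"{name} -> next step: {rate:.0%} drop-off")
# Here the 'details' step loses 50% of users, flagging it for re-evaluation.
```

A step that sheds a disproportionate share of users, as in the story above, stands out immediately from the ratios.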
Theme 3: Opens up visibility
As users rely less on the analytics team, they are better able to engage with stakeholders. There is no licence limit on the number of Amplitude users, so anyone can access the reports.
This matters because… Users can mediate conversations with sometimes opinionated stakeholders from a stronger position, because they can pull up relevant data points on their own.
Example: An embedded analyst ran some analysis, but the Engineers on the same team felt the numbers didn’t quite add up. The Engineers used the Amplitude dashboard to flag discrepancies, which the analyst then corrected. Working from the same data points meant the onus was not just on the analyst to get things right; it became a collaborative effort.
What’s next
With a new way of working established, the tool is now fully owned by the analytics team for ongoing support. A continued focus on understanding how users are evolving with the tool, plus usage data, will determine future priorities.
6 Lessons learned
- Define user needs and maintain a razor-sharp focus on them. This led to a much more specific understanding of the various user problems, and to tailored communications, training, onboarding and post-onboarding support, which increased adoption. Staying connected to user needs at all times, through a combination of user interviews and usage data, helped us understand how specific roles were using the tool and helped the team iterate to build the habit of continual use.
- Reduce complexity. Analytics team members on this project led the charge to review the several hundred tracked events, and determined that most of the product team’s key questions could be answered with just page view and CTA click events. Reducing complexity in this way aimed to increase curiosity, usage and eventual adoption of something new. Once users were comfortable with the tool, we started adding other event types, such as sign-up events, while keeping to the principle of not introducing every event at once.
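One way to enforce a reduced event set like this is to validate events against a small whitelist before they are sent. A sketch in Python, where the event names and payload shape are assumptions for illustration (loosely modelled on the `user_id` / `event_type` / `event_properties` fields Amplitude’s HTTP API accepts), not the FT’s actual schema:

```python
# Sketch: restrict tracking to a deliberately small event vocabulary.
# Event names ("page_view", "cta_click") are hypothetical, not the FT's.
from typing import Any, Optional

ALLOWED_EVENTS = {"page_view", "cta_click"}

def build_event(user_id: str, event_type: str,
                properties: Optional[dict] = None) -> dict:
    """Build one event payload, rejecting anything outside the reduced set."""
    if event_type not in ALLOWED_EVENTS:
        raise ValueError(
            f"'{event_type}' is not in the reduced event set: {sorted(ALLOWED_EVENTS)}"
        )
    return {
        "user_id": user_id,
        "event_type": event_type,
        "event_properties": properties or {},
    }

event = build_event("user-123", "cta_click", {"cta": "subscribe-button"})
print(event["event_type"])  # cta_click
```

Growing the whitelist later (e.g. adding a sign-up event once users are comfortable) is then a one-line, reviewable change.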
- Make it easy for non-analytics team members to understand event names. We ran rounds of user testing, renamed some events and meticulously added event descriptions so that non-analysts could easily understand what each event means.
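To make the renaming and descriptions concrete, one lightweight approach is a registry that maps raw tracking names to readable display names and plain-English descriptions. The entries below are invented for illustration; they are not the FT’s actual events:

```python
# Hypothetical event registry: raw tracking name -> reader-friendly metadata.
EVENT_REGISTRY = {
    "pv": {
        "display_name": "Page View",
        "description": "A user loaded a page on the site.",
    },
    "cta_clk": {
        "display_name": "CTA Click",
        "description": "A user clicked a call-to-action, e.g. a subscribe button.",
    },
}

def describe(raw_name: str) -> str:
    """Return a non-analyst-friendly one-liner for a raw event name."""
    entry = EVENT_REGISTRY.get(raw_name)
    if entry is None:
        return f"Unknown event: {raw_name}"
    return f"{entry['display_name']}: {entry['description']}"

print(describe("pv"))  # Page View: A user loaded a page on the site.
```

Keeping names and descriptions in one reviewable place also makes it easy to spot undocumented events during user testing.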
- Stagger the launch of the tool to improve the user experience with each iteration. Our first launch was to four users (early-adopter Engineers and Product Managers). This helped us clarify use cases, optimise training content and be clear on where the most value was coming from. In subsequent launches, we focused on understanding the priority questions a team was looking to answer, determined which events to bring in, and then reviewed how the tool was being used.
- Incorporate change management thinking with product thinking. We defined the user behaviour that needed to change and then ensured all our touchpoints reinforced that change. For example, product managers needed to be more aware of when to involve the tracking implementation team and of their own role in supporting better data quality (rather than the onus sitting solely with the analytics team). Communications were designed to support this message whenever we interacted with them about Amplitude.
- Identify an advocate and keep things visible. The Product Leadership Group — PLG (the Chief Product Officer and her direct reports), had a keen interest in ensuring their teams were self-serving. We would attend periodic PLG meetings to share progress and identify where they could help.
Many thanks to a wonderful internal product-analytics partnership that helped drive a positive change in how the FT builds products.