
Driving User Engagement Through Contextual Sign-In 

The Challenge

BBC content is mostly accessible without requiring users to sign in, which means many people browse anonymously. This limited the organisation’s ability to personalise experiences or understand user behaviour. To address this, the BBC formed a new Growth team focused on using experimentation and data to uncover user needs, increase sign-ins, and deliver more value to both users and the business.

My Role

As the Lead Product Designer on the BBC’s Growth team, I led the end-to-end design process from strategy and ideation to hands-on implementation and live experimentation.


Working closely with my Senior UX Designer, we shaped the overall user experience, ensuring our design decisions were grounded in user research, driven by data, and optimised for rapid testing.


As part of a newly formed team, we collaborated cross-functionally with product managers, engineers, creative directors, editorial and ethical teams, as well as other BBC product teams who owned components critical to our experiments.

 

Our goal was to design and deliver contextual sign-in experiences that increased Weekly Active Accounts (WAA), unlocked personalised content, and enabled high-velocity A/B testing across BBC Sport and News.


Goals

  • Increase user sign-ins without harming content consumption.

  • Support a higher volume of experiments across BBC products, such as article pages, BBC Live, Sport, and News.

  • Achieve a minimum 3% uplift in key engagement metrics.

Research & Insights

Looking at past and new research 

To guide the redesign of the sign-in experience, I carried out a combination of quantitative and qualitative research:


Quantitative Analysis

I reviewed internal survey data and A/B test results, which consistently showed that users were unlikely to sign in unless it was mandatory. Optional sign-in prompts had low engagement and did not drive meaningful uptake.
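For context, the uplift figures in these A/B tests come down to comparing sign-in conversion rates between a control and a variant. Below is a minimal sketch of that arithmetic in TypeScript, using a standard two-proportion z-test; the traffic numbers are invented for illustration and are not BBC data.

```ts
// Hypothetical numbers for illustration only; not actual BBC experiment data.
interface Arm {
  users: number;    // unique users exposed to this arm
  signIns: number;  // users who completed sign-in
}

// Two-proportion z-test: is the variant's sign-in rate reliably
// different from the control's?
function signInUplift(control: Arm, variant: Arm) {
  const p1 = control.signIns / control.users;
  const p2 = variant.signIns / variant.users;
  // Pooled proportion under the null hypothesis of "no difference".
  const pooled =
    (control.signIns + variant.signIns) / (control.users + variant.users);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.users + 1 / variant.users)
  );
  return {
    upliftPts: (p2 - p1) * 100, // absolute uplift in percentage points
    zScore: (p2 - p1) / se,     // |z| > 1.96 ≈ significant at 95%
  };
}

// Example with made-up traffic numbers:
console.log(signInUplift(
  { users: 120_000, signIns: 3_100 },
  { users: 120_000, signIns: 4_250 },
));
```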


Qualitative Feedback

I analysed user complaints submitted via Capita, our third-party feedback platform. The feedback highlighted several key issues:

  • Sign-in prompts were often ignored or seen as disruptive.

  • Users did not understand the purpose or benefits of signing in.

  • There was a general lack of clarity and trust in the value proposition.


Workshop Highlights

To translate research insights into actionable design strategies, I co-led a three-day workshop with cross-functional teams from Product, UX, and Accounts. The aim was to align on user needs, business goals, and feasible solutions to improve the sign-in experience.

Day 1: Journey Mapping

We analysed the existing sign-in journey to identify key friction points, mapping user motivations, behaviours, and common drop-off stages. This helped us pinpoint where—and why—users lost interest or abandoned the process.


Day 2: Theming & Prioritisation

We developed “How Might We…” statements based on patterns identified in user feedback, then grouped insights into key opportunity areas:

  • Communicating the value of signing in

  • Feature-based incentives to encourage sign-in

  • Comparing modal versus embedded sign-in patterns


We used dot voting to prioritise these themes based on perceived user impact and strategic value.


Day 3: ICE Scoring & Experiment Design

We applied the Impact–Confidence–Effort (ICE) framework to prioritise our ideas, focusing on experiments that offered high potential value with low implementation effort.


  • One example hypothesis: “If we offer a ‘Save for Later’ feature behind sign-in, more users will sign in to access saved content.”
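To make the prioritisation concrete, here is a minimal sketch of how ICE scores can rank candidate experiments. The idea names and scores are hypothetical, and dividing impact × confidence by effort is just one common way to operationalise the framework.

```ts
// Illustrative only: idea names and scores are hypothetical, not the
// actual workshop outputs.
interface Idea {
  name: string;
  impact: number;      // 1–10: expected effect on sign-ins
  confidence: number;  // 1–10: how sure we are the impact is real
  effort: number;      // 1–10: implementation cost (higher = harder)
}

// Reward impact and confidence, penalise effort.
const iceScore = (i: Idea) => (i.impact * i.confidence) / i.effort;

const ideas: Idea[] = [
  { name: "Save for Later behind sign-in", impact: 8, confidence: 6, effort: 4 },
  { name: "Article-limit mandatory sign-in", impact: 9, confidence: 7, effort: 5 },
  { name: "Dismissible sign-in modal", impact: 4, confidence: 5, effort: 2 },
];

// Highest-scoring ideas become the first experiments.
ideas
  .sort((a, b) => iceScore(b) - iceScore(a))
  .forEach((i) => console.log(i.name, iceScore(i).toFixed(1)));
```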


After the workshop, we moved into rapid prototyping and user testing to validate these assumptions. This iterative approach helped de-risk ideas early and ensured we were designing based on real user needs—not just internal opinions.

Outcome (Expectation vs Reality: A Sign-In Experiments Roundup)

  • Within 9 months we completed 17 experiments — 16 of them focused on improving sign-in. Some delivered big wins. Others… well, not so much.

  • Across 5 experiments, MSI (mandatory sign-in) consistently delivered the highest uplift in sign-ins compared with other formats such as banners or dismissible prompts.

  • The strongest result came from the Sport Article Limit MSI (sign-in required after 3 articles per day), which saw a +4.45% uplift in sign-ins with only a small drop in page views (-1.58%). A simplified sketch of the gating logic appears after this list.

  • Dismissible Modals? Most users say “Maybe Later”

    • We tested 4 DSI (dismissible sign-in) modals, and 98% of users clicked ‘Maybe Later’. Almost no one actually signed in.

  • Abandonment is real 

    • Across 5 MSI experiments, around 30% of users started the sign-in journey but didn’t complete it.
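As referenced above, the article-limit mechanic is essentially a per-day counter. The sketch below shows one plausible client-side version; the storage key, limit constant, and function name are my assumptions, not the production implementation.

```ts
const DAILY_LIMIT = 3;              // free articles per day (assumed)
const STORAGE_KEY = "articleReads"; // hypothetical storage key

function shouldRequireSignIn(isSignedIn: boolean): boolean {
  if (isSignedIn) return false; // signed-in users are never gated

  const today = new Date().toISOString().slice(0, 10); // "YYYY-MM-DD"
  const raw = localStorage.getItem(STORAGE_KEY);
  let state: { date: string; count: number } = raw
    ? JSON.parse(raw)
    : { date: today, count: 0 };

  if (state.date !== today) state = { date: today, count: 0 }; // daily reset

  state.count += 1; // count this article view
  localStorage.setItem(STORAGE_KEY, JSON.stringify(state));

  return state.count > DAILY_LIMIT; // gate the 4th read behind sign-in
}
```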

What I Learned

  • Owning the full process, from ideation to live implementation, enabled faster iteration and more focused design decisions.

  • Experimentation is cross-disciplinary. The team built stronger cross-functional ways of working, especially around experiment design, hypothesis setting, and interpreting results.

  • Collaboration fuels speed. Tight alignment with Product, Engineering, and Analytics helped us launch quickly and measure impact accurately.

  • The retrospective gave Product, Design, Engineering, Data, and Editorial a shared, safe space to surface friction points. It helped us identify key areas for improvement — from process and communication, to tooling and ways of working.

  • Fake-door tests are useful, but clicks alone tell us little; without a richer post-click experience, we missed deeper insight. Simple follow-up surveys would have allowed us to gather more meaningful user feedback.

  • Copy, timing, and placement of prompts significantly influenced user behaviour — with unclear CTAs often leading to confusion or drop-off.


Other Learnings

  • Over time, repeating similar sign-in experiments using the same modal designs and messaging created fatigue within the team, highlighting the need for variation and new ideas.

  • We began building reusable sign-in modules, such as flexible MSI modals and feedback capture components, to support faster, more scalable experimentation; one possible shape for such a module is sketched below.
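As a sketch of what reusable might look like, a flexible sign-in module can treat experiment copy and behaviour as configuration. The prop names and analytics callback below are illustrative assumptions; the actual BBC components are internal.

```ts
// A sketch of a reusable sign-in module's contract. Prop names and the
// analytics callback are assumptions for illustration, not the real API.
interface SignInModalProps {
  variant: "mandatory" | "dismissible"; // MSI vs DSI behaviour
  headline: string;                      // value-proposition copy under test
  benefits: string[];                    // e.g. ["Save articles for later"]
  experimentId: string;                  // ties events back to the A/B test
  onSignIn: () => void;
  onDismiss?: () => void;                // only meaningful for DSI variants
  onEvent: (name: "shown" | "sign_in" | "dismiss", experimentId: string) => void;
}

// Each new experiment then becomes configuration rather than a new build:
const sportArticleLimitMsi: Omit<SignInModalProps, "onSignIn" | "onEvent"> = {
  variant: "mandatory",
  headline: "Sign in to keep reading today",
  benefits: ["Pick up where you left off", "Save articles for later"],
  experimentId: "sport-article-limit-msi",
};
```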

What Happened Next

The team prioritised MSI experiments with clearer messaging and paused or redesigned less effective formats like DSI. Feedback collection improved, and targeted user research helped close important UX gaps. Reusable components were standardised, and documentation was centralised to speed up future tests. Collaboration with the Accounts team began to tackle sign-in drop-offs and boost conversion rates.
