B2E Internal Platform · Retail Onboarding · AI Learning System · 2024

AI Tutor

Retail educators were expected to retain a growing volume of product knowledge quickly, but static onboarding materials were not helping that knowledge stick.

I designed AI Tutor, a mobile learning system that replaced passive training with interactive modules and GPT-4-powered scenario practice.

At a Glance

100%

Product Knowledge Retention

Observed during usability testing after scenario-based training

Delivered

Prototype Handed Off

Final prototype delivered to Lululemon's development team for potential implementation

~25-35%

Faster Onboarding (Estimated)

Estimated reduction from replacing passive training with guided scenario practice

AI Tutor app screens

TL;DR

AI Tutor helped educators retain and apply product knowledge during onboarding through scenario-based learning, replacing passive training with contextual practice.

Project Info

Project

Sponsored Informatics Capstone project by Lululemon Athletica

Role

UX/UI Designer and Researcher

Team

1 UX/UI Designer, 1 Data Scientist, 1 Front-End Developer, 1 Back-End Developer, 1 PM

Timeline

01/08/2024 – 05/31/2024

Description

Led mobile UX design, user testing, and research synthesis from low-fi to high-fi prototype.

Prototype available here.

Product Preview

Context

When growth outpaces learning

Lululemon educators are retail associates responsible for delivering personalized guest experiences. As the brand expanded across 700+ global stores, onboarding required associates to absorb more product knowledge in less time.

Two findings from interviews and observation sessions directly shaped the product direction.

Lululemon store

The Real Problem

From better onboarding to better recall

What it looked like

Lululemon needed a better onboarding tool.

What I actually had to solve

The real issue was not just content delivery. Educators needed to retain and apply product knowledge in realistic guest interactions, not just move through static modules.

A cleaner onboarding tool alone would not have solved that. The real work was helping educators apply that knowledge confidently in live customer interactions.

Constraints

What shaped the direction

Time constraint

The product had to be designed, prototyped, and tested within a four-month capstone timeline, so we had to prioritize the highest-value learning flows.

AI feasibility

Real-time AI interactions had to remain lightweight and structured enough to feel useful on mobile without degrading performance.
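One way to honor this constraint, sketched below under assumptions the case study does not publish, is to wrap each scenario turn in a fixed JSON schema and cap the conversation context, so the mobile app renders a predictable card instead of streaming open-ended chat. The prompt text, field names (`guest_line`, `hint`, `is_done`), and turn cap are all hypothetical illustrations, not the team's actual implementation.

```python
import json

# Hypothetical system prompt: constrain the model to a fixed JSON reply
# so the mobile client can render a predictable scenario card.
SCENARIO_SYSTEM_PROMPT = (
    "You are a guest in a Lululemon store. Stay in character. "
    "Reply ONLY as JSON with keys: guest_line, hint, is_done."
)

def build_scenario_request(scenario: str, history: list[dict]) -> dict:
    """Assemble a compact chat payload: system prompt, scenario, last 4 turns."""
    messages = [
        {"role": "system", "content": SCENARIO_SYSTEM_PROMPT},
        {"role": "user", "content": f"Scenario: {scenario}"},
    ]
    messages.extend(history[-4:])  # cap context to keep requests small on mobile
    return {"model": "gpt-4", "messages": messages, "max_tokens": 150}

def parse_scenario_reply(raw: str) -> dict:
    """Validate the model's reply against the fixed schema; fall back safely."""
    try:
        reply = json.loads(raw)
    except json.JSONDecodeError:
        # Degrade gracefully: show the raw text as the guest's line.
        return {"guest_line": raw.strip(), "hint": "", "is_done": False}
    if not all(k in reply for k in ("guest_line", "hint", "is_done")):
        return {"guest_line": "", "hint": "", "is_done": False}
    return reply
```

Bounding the history and the response shape keeps each round trip small, which is the practical meaning of "lightweight and structured" on a retail-floor phone.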

Brand alignment

The interface needed to feel like part of Lululemon's internal ecosystem, not a separate experimental app. Familiarity was non-negotiable.

Tradeoff summary

These constraints pushed the product toward structured scenario-based learning rather than a fully open-ended AI tutor, a tighter scope that turned out to be the right call.

Research

I combined survey responses, educator interviews, and competitive analysis to understand what was making onboarding hard to retain and apply.

7 Survey Responses · 6 Educator Interviews · Competitive Analysis

What educators said

What do you think about your current training process?

“I want less training videos, more hands on experience, because training videos are hard to remember”

“Online training makes it difficult for me to ask questions and get a timely response”

Survey result

How likely are you to recommend the current training program to others?
Scale: 1 = extremely likely, 5 = not at all likely  ·  7 responses

Rating 1: 0 (0%)
Rating 2: 2 (28.6%)
Rating 3: 4 (57.1%)
Rating 4: 1 (14.3%)
Rating 5: 0 (0%)

What adjacent AI learning tools suggested

A review of Quizlet Q-Chat, Microsoft Reading Tutor, and Walmart's Ask Sam showed that AI can improve training by making content more adaptive, contextual, and easier to retrieve in the moment. This helped validate our direction for AI Tutor as a more personalized alternative to static onboarding materials.

Quizlet Q-Chat competitor analysis
Quizlet

Q-Chat

AI tutor that uses questions to prompt active recall instead of passive reading.

Takeaway

Retrieval practice beats passive review for retention.

Microsoft Reading Tutor competitor analysis
Microsoft

Reading Tutor

Adjusts content difficulty in real time based on learner performance.

Takeaway

Adaptive content keeps learners in the right challenge zone.

Walmart Ask Sam competitor analysis
Walmart

Ask Sam

Helps associates retrieve product information in the moment while working.

Takeaway

Just-in-time knowledge retrieval is more useful than pre-loaded training.

The common pattern was not just using AI, but using it to make learning more responsive, contextual, and useful in the moment. That insight helped shape our direction for AI Tutor.

Research Key Insights

🙋

Educators learn by doing, not by watching

Participants preferred hands-on training over hours of passive videos and reading materials.

Insight visualization - educators learn by doing
📋

Quizzes pass everyone through, regardless of knowledge gaps

The current format doesn't adapt to performance; educators move on whether they scored 4/10 or 10/10.

Quiz data visualization - knowledge gaps

User Persona

Two distinct educators shaped the design direction

From interviews and surveys, two educator archetypes consistently emerged — each with different learning patterns, frustrations, and needs.

Michael Lee
New Educator

Michael Lee

Basic Info

  • Newly Hired Lululemon Educator
  • 21-year-old college student at the University of British Columbia

Needs

  • Wants personalized learning that aligns with his strengths
  • Retention of product knowledge
  • Engaging and relevant training material
Sydney Johnson
Store Manager

Sydney Johnson

Basic Info

  • Lululemon Store Manager
  • 32-year-old White American
  • Bachelor's in Communications

Needs

  • Recognizes challenges in consistently delivering Lululemon's values due to frequent staff changes.
  • Recognizes Educators' difficulties in retaining and accessing product knowledge.
  • Sees need for a training system to reinforce and improve Educators' knowledge.

Key Design Decisions

Five decisions that shaped the product

Each decision below includes the problem it solved, what I considered, why I chose this direction, and what I would do differently.

Decision 01

Making modules easier to learn from

Problem

Early testing showed that modules felt too text-heavy, which made content harder to scan, reduced engagement, and weakened knowledge retention.

Considered

Keeping modules mostly text-based versus redesigning them with richer visual hierarchy, product imagery, and embedded video.

Why this

I introduced visual product images and short videos within modules so educators could absorb information more easily and connect training content to real products. This made the learning experience more scannable, engaging, and easier to retain.

Next time

Test which content formats actually improve retention most, for example, whether educators learn better from short video, product imagery, or more structured visual summaries.

Three module screens showing visual redesign progression

Decision 02

Add pre-landing pages to orient first-time users

Problem

Users did not know where to begin or how the learning tools connected. They landed directly on feature pages with no context.

Considered

Landing users directly into feature pages versus introducing each feature with a short orientation layer that explained its purpose and recommended starting point.

Why this

Pre-landing pages reduced confusion and helped educators understand the recommended learning path before starting. First-time orientation is cheap to design and expensive to skip.

Next time

Test whether experienced educators still needed this orientation layer, or whether it should adapt by user type and hide itself after the first session.

Before

Educators landed directly on feature pages with no context, making it hard to understand what each section was for or where to start.

After

Pre-landing pages now introduce each feature and guide educators to the recommended starting point, reducing confusion from the first tap.


Before: no orientation

Decision 03

Reinforce Lululemon's culture through Kudos

Problem

Onboarding at Lululemon is not only about product knowledge. It also needs to reflect the team-oriented culture that shapes how educators learn and work together.

Considered

Keeping AI Tutor purely instructional versus adding a lightweight social layer that acknowledged progress and peer recognition.

Why this

I introduced Kudos to make the learning experience feel more aligned with Lululemon's culture, where encouragement and community are part of how teams grow. This extended the product from a training tool into a more motivating and culturally familiar experience.

Next time

Test whether recognition features improved continued engagement over time, or whether they worked best only at specific milestones like module completion or first quiz pass.

Educators could search teammates, view achievements, and react with lightweight recognition.

Decision 04

Separate the AI Chatbot from the Scenario Quiz

Problem

Users confused the AI Chatbot with the AI Scenario Quiz and did not understand their distinct roles. Both tools were listed side-by-side at the same level.

Considered

Keeping both AI tools side-by-side versus giving them separate jobs in the system — one as a persistent assistant, one as a gated practice tool.

Why this

The chatbot became a persistent assistant for quick questions, while the scenario quiz lived inside modules for guided practice. That made each tool's purpose unmistakably clear.

Next time

Explore whether one tool could eventually subsume the other without reintroducing ambiguity — the distinction added clarity but also added cognitive overhead.

Before

Both AI tools were listed side-by-side on the same level, causing educators to mix up their distinct purposes and not know which to use.

After

The chatbot is now a floating persistent assistant available everywhere. The scenario quiz lives inside each module, making both roles unmistakably clear.


Before

Decision 05

Turn quiz results into a learning moment

Problem

A score alone did not help educators understand what to improve. The results screen showed a number and nothing else.

Considered

Simple pass/fail feedback versus detailed corrective guidance that explained what went wrong and where to go next.

Why this

The redesigned results page explained incorrect answers, surfaced the right response, and linked users back to the exact module to revisit. Failure became a directed learning moment instead of a dead end.
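The linking described above can be driven by data rather than hard-coded screens. The sketch below is a hypothetical data model, with invented names (`QuestionResult`, `module_id`, `build_review_plan`), showing how each missed question could carry a reference to the module that teaches it, so the "revisit" links on the results page are generated automatically.

```python
from dataclasses import dataclass

@dataclass
class QuestionResult:
    """One graded quiz question, tagged with the module that covers it."""
    question: str
    chosen: str       # answer the educator picked
    correct: str      # the right answer
    module_id: str    # module to revisit if this was missed
    explanation: str  # why the correct answer is right

def build_review_plan(results: list[QuestionResult]) -> dict[str, list[QuestionResult]]:
    """Group missed questions by module, producing the 'revisit' list."""
    plan: dict[str, list[QuestionResult]] = {}
    for r in results:
        if r.chosen != r.correct:
            plan.setdefault(r.module_id, []).append(r)
    return plan
```

Because the mapping lives in the question data, a wrong answer always points at exactly one place to relearn it, which is what turns a failed quiz into a directed next step.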

Next time

Test whether educators preferred corrective feedback immediately after each question or after reviewing their full result set — timing of feedback may matter as much as the content.

Before

The results screen simply showed a score. Educators had no way to understand what they got wrong or how to improve their knowledge gaps.

After

The redesigned results page now explains each incorrect answer, links to the relevant topic, and surfaces the exact module to revisit, turning failure into a learning moment.


Before

Outcome

What changed

Across two rounds of usability testing, AI Tutor helped educators retain and apply product knowledge more effectively than static onboarding materials. The strongest signal was 100% product knowledge retention observed after scenario-based training. The final prototype was delivered to Lululemon's development team for potential implementation.

If I were measuring this next, I would track long-term retention, confidence in live guest interactions, and whether educators could transfer learning from one scenario to another.

100%

Product Knowledge Retention

Observed during usability testing after educators completed scenario-based learning modules

25-35%

Faster Onboarding (Estimated)

Projected based on reduced reliance on static training materials. Not yet validated in production.

Improved Educator Confidence

Educators reported greater confidence recommending products after practicing real customer scenarios in testing

Reflection

What I got wrong first

I initially thought the problem was making onboarding more engaging. The deeper challenge was helping educators remember and apply product knowledge in context. With more time, I would test retention over a longer period and compare whether educators could transfer what they learned from one scenario type to another. This project taught me that training design is not about making content more interactive for its own sake. It is about creating the conditions for recall, confidence, and real-world use.
