2024・EdTech・Web
Chegg / AI-powered Guided Practice
Designed an AI-driven feature from 0–1 to drive engagement, tackling ambiguity and tech constraints.
Team
1 Content designer, 1 UXR, 1 PM, 1 Product MKT, 2 SWEs, 1 Learning experience designer, 1 Machine learning engineer
Role
Product design, Product strategy, UX research
Timeline
May–July 2024
Problem statement
How might we deepen student engagement beyond just viewing solutions?
Final Solution
Introduced a structured step-by-step guided practice feature that allows students to reinforce their learning progressively.
Explore
Entry Points - Caught the Right Moment
Working with the learning experience designer, I designed multiple practice entry points to capture students at the right learning moment: immediate practice, practice variations, and topic review practice.
Pattern - Balanced Engagement with Ease of Use
For the interaction pattern, I explored three modes:
Standalone practice module (structured, clear but rigid)
Conversational practice (interactive but requires high engagement)
Whiteboard-style open practice (flexible but harder to guide)
After evaluating feasibility with engineering and the learning experience designers, we aligned on a structured step-by-step approach, balancing engagement with ease of use.
Features - Guided, Interactive
Based on that direction, I developed interactive prototypes around three core features.
Instant feedback ⚡
Reinforces learning by providing real-time insights, helping students correct mistakes immediately.

Flexible options 💡
Offers options such as hints and revealing the answer, so students feel supported and in control while the challenge level is maintained.

Varied question formats 🧩
Caters to diverse question types, such as multiple choice, fill-in-the-blank, and drag-and-match, to keep the experience engaging and dynamic.
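As a rough illustration of how these formats could be modeled in a single data structure, here is a minimal TypeScript sketch; the type and field names are assumptions for this write-up, not Chegg's actual schema.

```ts
// Illustrative only: names and fields are assumptions, not Chegg's schema.
type PracticeQuestion =
  | { kind: "multiple-choice"; prompt: string; choices: string[]; correctIndex: number }
  | { kind: "fill-in-the-blank"; prompt: string; acceptedAnswers: string[] }
  | { kind: "drag-and-match"; prompt: string; pairs: { left: string; right: string }[] };

// Rendering branches on `kind`, so a new format can be added
// without touching the existing ones.
function describe(question: PracticeQuestion): string {
  switch (question.kind) {
    case "multiple-choice":
      return `${question.prompt}\n${question.choices.join("\n")}`;
    case "fill-in-the-blank":
      return question.prompt;
    case "drag-and-match":
      return question.pairs.map((p) => `${p.left} ↔ ?`).join("\n");
  }
}
```

A tagged union like this makes each new question type an additive change rather than a rework, which fits the plan for future expansion described below.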

Iteration 2: Added delight to provide more support
Optimized the UI
Provided more emotional support by showing empathy when students get an answer wrong and celebrating their successes
Execute
As the design evolved, I collaborated with machine learning engineers and learning experience designers to explore using AI prompts for generating practice questions.
Technical challenge 1: Chegg’s expanding question pool, diverse query types requiring unique formats, and limited time
To address this, we focused on an MVP, prioritizing procedural questions (the most common) and multiple-choice formats (widely applicable), delivering a functional solution with plans for future expansion.
This led us to explore whether we could generate content parts incrementally instead of waiting for everything to be generated at once.
To communicate this idea, I created a flow diagram of the logic:
We later realized this approach would also reduce token costs, since we only generate the next step when the student takes an action, which is a win-win for the product and the business.
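Below is a minimal sketch of that incremental flow, assuming a generic LLM call wrapped in a callModel helper; all names and shapes here are illustrative, not the production implementation.

```ts
// Illustrative sketch of on-demand step generation.
// Function and type names are assumptions, not production code.
interface PracticeStep {
  prompt: string;
  expectedAnswer: string;
}

interface PracticeSession {
  problem: string;
  completedSteps: PracticeStep[];
}

// Stand-in for the actual call to the question-generation model.
async function callModel(prompt: string): Promise<PracticeStep> {
  // ...send `prompt` to the LLM and parse the structured response...
  return { prompt: "placeholder", expectedAnswer: "" };
}

// Generate the next step only when the student submits the current one,
// instead of generating the whole solution path up front. Tokens are
// spent only on steps the student actually reaches.
async function onStudentSubmit(session: PracticeSession): Promise<PracticeStep> {
  const context = [
    `Problem: ${session.problem}`,
    ...session.completedSteps.map((s, i) => `Step ${i + 1}: ${s.prompt}`),
    "Generate only the next step of the guided practice.",
  ].join("\n");

  const nextStep = await callModel(context);
  session.completedSteps.push(nextStep);
  return nextStep;
}
```

Because nothing beyond the current step exists yet, a wrong answer or an early exit costs no extra generation.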
Future scalability
Meanwhile, our team was re-evaluating the product architecture based on user feedback—whether to go with an all-conversational UI or a hybrid model. To adapt to both possibilities, I created two design versions to fit each approach.
Learnings
Navigating Ambiguity
Continuous validation and iteration are crucial to refining solutions and ensuring they address both user needs and business goals effectively.
Proactive Collaboration with Engineering
I realized the value of staying proactive and working closely with engineers to explore feasible alternatives together; it helped ensure the design aligned with technical capabilities.