Designing AeT’s First Student and Educator Dashboards

Pearson

Turning raw performance data into clear, actionable insights across web and native.

Context

Pearson’s Assignable eText (AeT) product provided strong assessment and reading experiences, but lacked a dedicated space for students and educators to understand performance, progress, and assignment trends.

Before this project, there were no AeT dashboards — no centralized view, no early signals for educators, and no clear daily priorities for students.

As the Senior Product Designer, I was responsible for designing AeT’s first Student and Educator dashboards from zero to one across web and native. This included defining the IA, insight hierarchy, flows, and the design system patterns needed to support the experience.

Problem
Educators needed:
  • A quick view of class health

  • Early identification of at-risk students

  • Trends across topics and assignments

  • Clear drill-downs for deeper insight

Students needed:
  • A single place to understand what’s due

  • Clear prioritization (today, overdue, upcoming)

  • Consistent guidance across web and native

  • Support when falling behind

AeT had the data but lacked a unified experience to surface insights in a way that supported real teaching and learning workflows.

Research Insights

Partnering with research, I conducted interviews with 13 educators and ran additional student sessions to understand their mental models, workflows, and priorities.

Educator Insights (MoSCoW Prioritization)

Must-Have for MVP:

1. Assignment-Level Histograms

Need score distribution, not averages.

“I want a histogram of completion, not just the average.”

2. Student & Topic Drill-Downs

Compare students, assignments, and topics.

“I want to compare students across assignments and topics.”

3. Behavior-Change Detection

Early visibility into drops in engagement or performance.

“I want to see changes in student behavior.”

Should-Have:

1. Topic-level performance patterns

2. Class-section comparisons

Could-Have:

1. National benchmarks

Won’t-Have:

1. Raw time-on-platform metrics (too misleading)

“They’ll just log in for 3 minutes to look active.”
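The behavior-change detection educators asked for can be illustrated with a small sketch: compare a student's recent activity against their own earlier baseline and flag a significant drop. Everything here is a hypothetical illustration, not AeT's actual logic; the window sizes and the 30% threshold are assumptions.

```python
# Illustrative sketch only -- not AeT's implementation.
# Flag a meaningful drop in a student's recent engagement
# relative to their own earlier baseline.
# Window sizes and the 30% threshold are hypothetical.

def detect_engagement_drop(weekly_scores,
                           baseline_weeks=4,
                           recent_weeks=2,
                           drop_threshold=0.30):
    """Return True if the recent average falls more than
    `drop_threshold` below the baseline average."""
    if len(weekly_scores) < baseline_weeks + recent_weeks:
        return False  # not enough history to compare
    baseline = weekly_scores[-(baseline_weeks + recent_weeks):-recent_weeks]
    recent = weekly_scores[-recent_weeks:]
    baseline_avg = sum(baseline) / len(baseline)
    recent_avg = sum(recent) / len(recent)
    if baseline_avg == 0:
        return False  # no baseline activity to drop from
    return (baseline_avg - recent_avg) / baseline_avg > drop_threshold

# A student whose weekly completion falls from ~90% to ~40%
# would trigger an alert:
print(detect_engagement_drop([0.9, 0.88, 0.92, 0.9, 0.45, 0.4]))  # True
```

Comparing each student to their own baseline, rather than to raw time-on-platform, sidesteps exactly the gaming problem educators flagged above.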

Student Priorities

1. What’s due today

2. What’s overdue

3. What’s upcoming

4. Review
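These student priorities map naturally onto a simple bucketing rule over due dates and completion status. The sketch below is purely illustrative; the tuple shape and field names are hypothetical, not AeT's data model.

```python
from datetime import date

# Illustrative sketch only: bucket assignments into the priority
# groups students asked for (today, overdue, upcoming, review).
# The (title, due_date, completed) tuple shape is hypothetical.

def bucket_assignments(assignments, today):
    """Group assignments by the student's priority order."""
    buckets = {"today": [], "overdue": [], "upcoming": [], "review": []}
    for title, due, completed in assignments:
        if completed:
            buckets["review"].append(title)  # finished work feeds review mode
        elif due == today:
            buckets["today"].append(title)
        elif due < today:
            buckets["overdue"].append(title)
        else:
            buckets["upcoming"].append(title)
    return buckets

result = bucket_assignments(
    [("Quiz 3", date(2024, 5, 1), False),
     ("Reading 7", date(2024, 4, 28), False),
     ("Lab 2", date(2024, 5, 6), False),
     ("Quiz 2", date(2024, 4, 20), True)],
    today=date(2024, 5, 1),
)
# result["today"] == ["Quiz 3"]; result["overdue"] == ["Reading 7"]
```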

Approach
1. Design for decisions, not data

Every insight needed to answer, “What should I do next?”

2. Unify the system across web + native

Shared card layouts, spacing, and navigation ensured consistent behavior.

3. Reduce cognitive load

Surface essentials first, with drill-downs available when needed.

Final Experience

Educator Dashboard (Instructor Insights for AeT)

These screens show how the Educator Dashboard supported data-driven teaching across class, assignment, and student levels.

Class Overview — Desktop & Mobile

Gives instructors a clear snapshot of class health, students needing support, and topic-level challenges. This helped educators identify issues early and prioritize where to intervene.

Assignment Drill-Down — Desktop & Mobile

A detailed breakdown of assignment performance, including score distributions, question-level trends, and completion rates. This enabled teachers to quickly understand where students were struggling.

Student Drill-Down — Desktop & Mobile

A focused view of an individual student’s engagement, performance, and risk indicators. Educators used this screen to offer personalized support and track progress over time.

Behavior-Change Alert — Desktop & Mobile

Automated alerts highlight meaningful shifts in engagement or performance. These notifications gave educators early visibility into drops in progress and allowed timely intervention.

This became the central hub for understanding class performance within AeT.

Student Dashboard (AeT)

Designed to reduce confusion and help students focus on what matters most.

What’s Next — Desktop & Mobile

A quick view of upcoming tasks and actions so students always know their next step.

Assignments — Desktop & Mobile

A structured breakdown of due, overdue, and completed work to help students stay organized.

Review Mode (Flashcard Press) — Desktop & Mobile

A dedicated space for reinforcing concepts, reviewing missed questions, and strengthening understanding.

Review Mode (Topic Press) — Desktop & Mobile

A dedicated space for practicing topic-level multiple-choice questions, reviewing missed answers, and strengthening understanding.

Students quickly understood priorities and next steps.

Design System Contributions (Web + Native)

Because AeT had no dashboards before this project, many of the required design system components did not exist.

The components below became foundational elements in the Pearson Design System and supported later theme work, including Sepia Mode.

Insight Cards

A scalable pattern for surfacing key insights across web and native, providing a consistent summary layout for instructors and students.

Performance Charts

Unified histogram and trend components that standardize how performance data is visualized across AeT and reduce inconsistency across teams.
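Underneath a histogram component like this sits a simple binning step: counting scores into distribution buckets rather than collapsing them into an average, which is exactly what educators asked for. The sketch below is an illustrative assumption; the bucket width and labels are hypothetical.

```python
# Illustrative sketch only: bin scores (0-100) into the distribution
# buckets a histogram component would render, instead of reporting
# a single average. The 10-point bucket width is a hypothetical choice.

def score_histogram(scores, bucket_size=10):
    """Count scores into fixed-width buckets keyed by range label."""
    counts = {f"{lo}-{lo + bucket_size - 1}": 0
              for lo in range(0, 100, bucket_size)}
    for s in scores:
        # Clamp a perfect 100 into the top bucket.
        lo = min(int(s) // bucket_size * bucket_size, 100 - bucket_size)
        counts[f"{lo}-{lo + bucket_size - 1}"] += 1
    return counts

hist = score_histogram([95, 92, 88, 71, 70, 45])
# hist["90-99"] == 2; hist["40-49"] == 1
```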

Alert & Prioritization Indicators

A clear system for highlighting at-risk students and urgent tasks using consistent color, hierarchy, and status tokens.

Outcomes
For Educators:

• Faster identification of at-risk students

• Clear visibility into assignment and topic trends

• Predictable drill-down patterns

For Students:

• Clear daily priorities

• Better engagement with remediation

• Reduced friction across devices

For AeT + Design System:

• New insight + chart components

• Stronger cross-platform alignment

• Scalable patterns for future insight features

• Components reused across learning teams

Next steps

AeT Insights has successfully passed its White Glove milestone, the final internal review phase before release.

The dashboards are moving into production rollout, with the first live release scheduled for December 21, putting the new Student and Educator experiences into the hands of real users for the first time.

The next phase focuses on:

• Early engagement monitoring

• Refining drill-down depth

• Scaling insight patterns across AeT