Program Assessment & Student Evaluation Design

An innovation design and research project in partnership with the University of Illinois College of Medicine to improve how it manages data collection and analysis

Project Overview

The University of Illinois College of Medicine (UICOM) is split across three campuses, each with its own data management practices, which makes it difficult to operate UICOM as a single, integrated system.

This decentralized decision-making has produced inconsistent data management practices across the campuses, complicating efforts to track individual student progress and curricular content accurately and efficiently.

Our goal was to create a strategy that aligned stakeholders on standard data management practices, accounted for the needs of each campus and stakeholder group, and enabled accurate tracking of student progress and curricular evaluation.

Timeline

Full time from October 2021 to March 2022

My Role

As a UX Designer, I collaborated with a team to conduct interviews, develop research stimuli, and create wireframes and prototypes to unify the University of Illinois College of Medicine's data management practices.

Throughout this project, I honed my UI/UX design skills, deepened my expertise in user research and concept testing, and enhanced my ability to iterate on designs based on stakeholder feedback.

Stakeholder Interviews and Goal Alignment — 3 weeks

My team and I kicked off the project by conducting interviews with 12 key administrative stakeholders, including regional deans, the dean of education planning and improvement, and related administrative roles.

The purpose of these interviews was to uncover the initial needs and desires from a high-level perspective. Additionally, we researched the stages of medical school to gain a comprehensive understanding of the student journey, from admission to residency matching and graduation.

This foundational research was crucial in guiding our approach to developing a unified data management strategy for UICOM.

Phase 1: Discovery and Alignment

Faculty & Student Interviews — 6 weeks

We then conducted interviews with faculty and students using a show-and-tell approach.

The goal was to identify the capabilities and functionality that users at each campus sought in a data management system.

Through these interviews, we identified ten key themes and developed "How Might We" statements to address them.

For example, we explored how to decouple data from specific personnel to make it more accessible, how to motivate faculty to provide quality feedback, and how to connect curriculum learning objectives to specific learning experiences.

These insights helped us understand the diverse needs and challenges faced by faculty and students across the different campuses.

Phase 2: Uncovering User Needs

We might be engaged in any one of a laundry list of things that we have to manage, and we’ve got maybe three people who do that.
— Data Management Team
How do we incentivize faculty to spend sufficient time in reflecting on a student to write something to create a narrative that would have greater overall value?
— UICOM clerkship faculty member
We’ve got two issues. One is that we haven’t really educated people about what our program objectives even are, the competencies that they should be mapping to. But then also they don’t have a place to map them into when they do start to get educated.
— UICOM Phase 1 faculty member

Theme: Reliance on Specific Personnel

Current assessment and evaluation systems are overly reliant on specific personnel, creating a bottleneck in data flow.

How Might We…

decouple data from individuals and make it more accessible and transparent across the College of Medicine?

Theme: Quality of Narrative Feedback

Faculty need motivation to provide high-quality narrative feedback, especially with USMLE Step 1 moving to pass/fail scoring.

How Might We…

empower and motivate clinical faculty to give quality formative and summative feedback on students?

Theme: Comprehensive Curriculum Map

A comprehensive map of where and how learning objectives are taught, practiced, and assessed should sit at the heart of our data strategy.

How Might We…

connect curriculum learning objectives to specific learning experiences in a way that fosters collaboration between faculty and campuses?

Phase 3: Designing for Impact

Faculty & Student Interviews — 3 weeks

From December 2021 to February 2022, we focused on prototyping and testing our ideas.

Collaborating closely with College of Medicine faculty, we used sketches as research stimuli to determine which screens and functionality would be most useful.

Research stimuli for Round 3 interviews

Wireframing — 3 weeks

Based on these insights, I developed wireframes to test our design principles further. I created a click-through prototype in Figma to simulate user interactions and gather actionable feedback from stakeholders.

These screens were refined through continuous feedback, ensuring they met the users' needs effectively.

Below are three examples of the wireframes I designed.

Wireframe 1: Activity Card

I designed this wireframe to give a comprehensive overview of lecture and faculty performance. Its key components were chosen based on our research findings to support continuous improvement in teaching and assessment practices.

Wireframe 2: Quiz Results Screen

I wanted this wireframe to give faculty a comprehensive view of quiz performance data. Its components were chosen based on research findings so that the screen provides actionable insights and supports effective assessment management.
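To make the kind of data behind this screen concrete, here is a minimal sketch in Python of two per-item statistics such a view could surface: item difficulty (the proportion of students answering correctly) and item discrimination (the point-biserial correlation between an item and the rest of the quiz). The response matrix, function name, and toy numbers are illustrative assumptions, not UICOM's actual data model.

```python
import numpy as np

def item_statistics(responses: np.ndarray) -> dict:
    """Classical test theory item statistics for one quiz.

    responses: array of shape (students, items); 1 = correct, 0 = incorrect.
    Returns per-item difficulty (proportion correct) and discrimination
    (point-biserial correlation between the item and the rest of the quiz).
    """
    n_items = responses.shape[1]
    total = responses.sum(axis=1)                 # each student's total score
    difficulty = responses.mean(axis=0)           # higher value = easier item

    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]            # total score excluding item j
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]

    return {"difficulty": difficulty, "discrimination": discrimination}

# Toy example: 5 students x 4 items
quiz = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])
stats = item_statistics(quiz)
print(stats["difficulty"])       # e.g. the last item was answered correctly by 80% of students
print(stats["discrimination"])   # low or negative values flag items worth reviewing
```

Plotting difficulty against discrimination is one way to build the kind of test-difficulty scatter plot mentioned later in this case study.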

Wireframe 3: Performance Distribution Screen

This wireframe provides a detailed breakdown of student performance by campus. Our research showed that faculty and administrators needed clear, visual tools to compare performance across campuses and identify areas requiring targeted intervention.
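As a rough illustration of the visualization behind this screen (the campus labels, score ranges, and sample sizes below are placeholders, and the real screen was a Figma wireframe rather than a plotted chart), overlaid histograms are one straightforward way to compare score distributions across campuses:

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder data: simulated exam scores per campus. In practice these
# values would come from the unified assessment database, not a simulation.
rng = np.random.default_rng(seed=0)
scores = {
    "Campus A": rng.normal(78, 8, 300),
    "Campus B": rng.normal(75, 9, 120),
    "Campus C": rng.normal(80, 7, 90),
}

fig, ax = plt.subplots(figsize=(8, 4))
bins = np.linspace(40, 100, 25)
for campus, values in scores.items():
    ax.hist(values, bins=bins, alpha=0.5, label=campus)

ax.set_xlabel("Exam score (%)")
ax.set_ylabel("Number of students")
ax.set_title("Performance distribution by campus")
ax.legend()
plt.tight_layout()
plt.show()
```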

Airtable “Works-like” Prototype: Curriculum Mapping Database

In the course of our primary research, we learned that the Association of American Medical Colleges (AAMC) publishes the Physician Competency Reference Set (PCRS), which outlines common expectations for physician training.

Currently, UICOM cannot link its curriculum to PCRS objectives because curriculum data is scattered across Excel sheets, Google Docs, and other workarounds, leading to probationary accreditation for the 2021-2022 academic year.

To address this, I compiled scattered data into a unified database in Airtable, resulting in the Comprehensive Curriculum Dashboard. This interactive tool allows high-level administrators to identify gaps in meeting PCRS objectives within the curriculum.
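The idea behind the dashboard can be sketched as a simple linked data model: learning experiences reference the PCRS objectives they address, and any objective with no references is a coverage gap. The sketch below is illustrative only; the class names, placeholder objective codes, and field names are my assumptions, not the actual Airtable schema.

```python
from dataclasses import dataclass, field

@dataclass
class PCRSObjective:
    code: str            # placeholder codes below; real PCRS objectives are numbered
    description: str

@dataclass
class LearningExperience:
    name: str            # lecture, small group, clerkship activity, etc.
    campus: str
    pcrs_codes: list[str] = field(default_factory=list)   # objectives it maps to

def unmapped_objectives(objectives, experiences):
    """Return the PCRS objectives that no learning experience currently covers."""
    covered = {code for exp in experiences for code in exp.pcrs_codes}
    return [obj for obj in objectives if obj.code not in covered]

objectives = [
    PCRSObjective("PC-1", "Patient care objective (placeholder text)"),
    PCRSObjective("KP-2", "Knowledge for practice objective (placeholder text)"),
]
experiences = [
    LearningExperience("Cardiology lecture series", "Campus A", pcrs_codes=["KP-2"]),
]

for gap in unmapped_objectives(objectives, experiences):
    print(f"No learning experience mapped to {gap.code}: {gap.description}")
```

In Airtable, this kind of relationship is typically expressed with linked-record fields between two tables, which is what makes coverage gaps easy to surface in dashboard-style views.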

The prototype was instrumental in gaining more support from stakeholders for the project by clearly demonstrating the benefits of a unified data management approach.

Screens from the Airtable prototype

Phase 4: Charting the Future

Presentation to Stakeholders

In March 2022, we presented our findings and proposed a roadmap for the future.

We identified two key questions that define UICOM's program success:

  1. To what extent are we preparing students for standardized exams?

  2. To what extent are we training students to be successful practitioners?

To address these questions, we developed eight design principles, each embodied in our finalized prototypes. These included both "looks-like" prototypes in Figma and "works-like" prototypes in Airtable, such as the Comprehensive Curriculum Dashboard.

Our presentation aimed to align stakeholders on a unified vision for UICOM's data management strategy moving forward.

8 design principles presented to stakeholders

Reflections and Learnings

Working on the University of Illinois College of Medicine project was a highly educational and rewarding experience. Here are some key reflections and learnings from the project:

1. User-Centered Design is Crucial: Conducting thorough user research and stakeholder interviews was pivotal. By deeply understanding the needs and pain points of faculty, students, and administrative staff, we were able to design solutions that were genuinely useful and well-received.

2. Importance of Iterative Design: The iterative nature of our design process, involving continuous feedback and testing, was essential. It allowed us to refine our wireframes and prototypes to better meet user needs, ensuring that the final designs were both functional and user-friendly.

3. The Value of Data Visualization: Incorporating visual analytics into our designs, such as the Test Difficulty scatter plots and performance distribution histograms, provided clear and actionable insights. This reinforced the importance of visual data representation in making complex information accessible and understandable.

4. Stakeholder Engagement: Engaging stakeholders early and often, and using prototypes like the Comprehensive Curriculum Dashboard, was critical in gaining support for the project. Demonstrating tangible benefits through prototypes helped in aligning all stakeholders on a unified vision.

5. Cross-Disciplinary Collaboration: Working with stakeholders across campuses and roles underscored the value of cross-disciplinary collaboration. It enriched the design process and ensured that the solutions we developed were comprehensive and addressed diverse needs.

Overall, this project reinforced the importance of empathy, adaptability, and collaboration in design. It provided invaluable lessons in creating user-centered solutions that not only meet immediate needs but also lay the foundation for future growth and improvement.