University Research Portal: A DEI Initiative
Product:
The university research portal was developed to improve equity and inclusion for historically underrepresented student populations in university research labs.
Problems:
Faculty were not using the portal to select lab assistants.
Students were giving up on their pursuit of research assistantships after waiting months with no response to their applications.
The goal to improve equity and inclusion could not be achieved without faculty adoption and engagement with the research portal system.
Research Questions:
Why weren’t faculty using the platform?
What pain points were faculty users experiencing?
How could we increase faculty use without implementing a mandate?
Goals Achieved:
Insights and recommendations led to implementation of 5 major (and several minor) updates to the research portal platform.
Significantly increased voluntary user adoption and engagement, resulting in improved equity and inclusion for historically underrepresented student populations in university research labs.
Increased response rate to applications by 3,385%.*
*That is not a typo!
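For readers who want to sanity-check the headline figure, a percent increase relates the before and after rates multiplicatively. This is a minimal sketch, assuming a baseline response rate of roughly 1% (the precise baseline and post-launch rates are illustrative, not reported values):

```python
def percent_increase(before, after):
    """Relative change from `before` to `after`, expressed as a percentage."""
    return (after - before) / before * 100

# Hypothetical illustration: if roughly 1% of applications received a
# response before the redesign, a 3,385% increase implies a post-launch
# response rate of about 34.9%.
after = 0.01 * (1 + 3385 / 100)   # baseline * (1 + increase / 100)
print(f"{after:.1%}")             # implied post-launch response rate
print(f"{percent_increase(0.01, after):.0f}%")  # recovers the reported increase
```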
Before
After
Background
Undergraduate research assistantships are scarce, highly coveted, and a major contributing factor to future professional success.
The typical process to land a research assistantship puts first-generation and traditionally underrepresented college students, as well as those from low socioeconomic backgrounds, at an inherent disadvantage.
A research portal with a universal application system was developed to level the playing field for all students.
The goal to improve equity and inclusion could not be achieved without faculty adoption and engagement with the research portal system.
The Problem
-
In the first year, students submitted nearly 1,000 applications in the portal.
Fewer than 1% of students received a response to the applications they submitted in the portal.
The goal to improve equity and inclusion could not be achieved without faculty adoption and engagement with the research portal system.
-
Conversations about this problem were focused on how to enforce a school-wide mandate to require faculty to use the portal to select their research assistants.
I suggested we approach this problem through a UX lens instead, attempting to improve the user experience in the portal by making data-driven changes to the key flows and design.
Research Questions
Why weren’t faculty using the platform?
What pain points were faculty users experiencing?
How could we increase faculty use without implementing a mandate?
Ultimate Project Goal
Create a more equitable and inclusive solution for students to get research experience through voluntary user adoption and engagement.
Project Specifications
-
After consulting with the project managers and developers, we established a timeline of 2 weeks from the initial meeting to the presentation of research insights and recommendations.
-
The research portal has 2 primary user groups:
Students
Faculty
Given the short timeline and because my research questions all focused on faculty use and pain points, I prioritized research faculty as the key user group for this project.
Within that user group, I strategically recruited participants from three sub-groups:
Those who had previously used the portal with some success
Those who had previously used the portal with no success
Those who had never used the portal
-
Due to the nature of the academic cycle, we expected to see cyclical use of the research portal, with most research faculty only having a need to use the portal once or twice per semester.*
With use cycles spread out that far, one of the biggest challenges driving the timeline for this project was the need to allow enough time for our developers to implement any changes before the next recruiting cycle began.
*Validated during project
-
Another major challenge created by the constrained timeline was that it eliminated the opportunity to iterate on the design before launch.
We had to collect the most comprehensive, insightful data possible in the shortest time possible or the entire project would very likely be dead in the water. (No pressure, right?!)
-
Unfortunately, there was no budget allocated to this project. Even so, I was able to solicit sufficient participation on short notice without offering any financial incentive.
“We have one more shot at this—if it doesn’t work, they’ll never give it another chance.”
Study Design
Getting creative with research methods to answer complex questions and work around logistical constraints is my forte! I had a lot of fun with this one!
Methods Used
-
Identify design issues: Heuristic evaluation can allow researchers to identify many design issues and can be used in conjunction with other research methods to help define important areas of focus.
Ideal for tight timeline, limited resources: Given the timeline and limited resources, a heuristic evaluation was an ideal first step for this project.
-
Need for flexibility: Unlike in a typical study, I had to take an individualized approach to each study session due to varying faculty needs and preferences.
Combined methods: I ultimately combined aspects of the following methods:
Semi-structured interviews
Unstructured interviews
Unmoderated remote interviews (via email)
Usability testing
Simulated use
Maintaining validity of the research: It was crucial for this project to have the utmost flexibility during each participant session. I have a deep understanding of best practices in behavioral research, so I was able to do this effectively and seamlessly without compromising the validity of the research.
-
Purpose: It quickly became clear that a major pain point for faculty involved receiving seemingly contradictory responses on student applications. In order to assess the root cause of that issue, I recruited student participants as well.
Logistics: I opted to use focus groups for the student participants primarily to accommodate logistical constraints.
Venue & Scheduling: I conducted focus groups in person and via MS Teams at different times of day and on different days of the week.
I was limited to convenience sampling, so I varied the days, times, and mediums in order to expand my participant pool as much as possible.
Structure: I used a semi-structured format for each focus group and included several planned tasks, predetermined questions, and open-ended discussions.
Flexibility: Student focus groups allowed for more structure than the faculty sessions, but my ability to quickly pivot ultimately allowed me to incorporate methods like participatory design upon discovering confusion in areas I had not anticipated.
For example, I learned that students were confused by the way some questions were presented, leading to inconsistent responses intended to communicate the same information. Upon making this realization, I shifted our focus to a participatory design activity to solicit feedback on more intuitive ways to present those questions.
Other Methods Considered
I typically use a rule-in process where I determine the type of data needed to answer the research questions, and select the methods most suitable for collecting those data given goals and constraints specific to the project at hand.
For the purpose of this portfolio, I will also offer brief explanations of my reasoning for ruling out other methods.
-
Time constraints: Given little time to recruit a large number of participants or to create alternative designs to test, A/B and multivariate testing were not feasible options for this project.
With more time: I would have loved to compare multiple versions of the portal design, assessing various performance metrics (e.g., time on task) to further improve the user experience.
-
Insufficient breadth: It was critical for this project to collect the most comprehensive, insightful data possible. I determined that even the most thoughtfully designed survey would not provide us with the breadth of data we would need for our objectives.
Recognizing the potential for bias: Given the circumstances in this case, I would have had to make unvalidated assumptions to develop the questions for a survey that would be sufficiently broad without over-reliance on open-ended questions. Doing so has the potential to introduce unintended researcher bias.
-
Timeline: A diary study would have needed to be much longer than the allotted time for this study and would have required significantly more resources dedicated to preparation, instruction, and follow-up.
Participation & Attrition: Given that this project took place during one of the busiest times of the semester, recruiting faculty participants would have been nearly impossible and attrition rates would have been exceptionally high.
-
Not logistically feasible: The nature of academic work makes it difficult for faculty members to determine in advance when and where they will complete a particular task, especially when that task seldom needs to be addressed. To truly approach this contextually would have required far more time and resources than feasible with this study.
Observing non-events: The primary issue that prompted this research is that faculty were not using the research portal. Designing a study that would require a researcher to observe a natural occurrence of a behavior that rarely occurs naturally is… not ideal.
Target re-launch date: We also had to consider that we aimed to release this major update BEFORE faculty began the process of selecting new research assistants again.
-
Not ideal to answer these research questions: Several of these methods could have been useful in some contexts related to the research portal, but they would have contributed little to no information that could have led to actionable insights and recommendations specifically related to our research questions.
Analysis
Heuristic Evaluation
The heuristic evaluation surfaced issues with several usability heuristics. For example:
-
Issue: Terminology used in the research portal differed from the terminology faculty typically use.
-
Issue: Clicking “Faculty Profiles” brought up a faculty profile page, but clicking on “Student Profiles” brought up the database of student applications.
Issue: Clicking “Faculty Profiles” brought up a different page than the primary university faculty profile page. Many faculty members were not aware the secondary page existed.
-
Issue: The student profile page contained irrelevant information* presented in a non-uniform way.
*Validated during the project
-
Issue: No tutorials or troubleshooting pages existed to help users complete their tasks.
Affinity Mapping / Thematic Analysis
I identified 8 key themes from the data collected during faculty feedback sessions.
-
Very few faculty members could find the website to access the research portal, even when they wanted to use it.
-
The research portal is so rarely needed (due to the cyclical nature of the academic schedule), it was easy to forget about it.
-
Most faculty delegate a graduate student or a lab manager to select research assistants for the coming semester. Without a function to delegate this task in the portal, using it increased the workload for most faculty.
-
Users were overwhelmed by the visual clutter and lack of uniformity on the application overview page, making them more inclined to abandon the task.
This issue was identified in the heuristic evaluation and validated during testing.
Key Themes
-
The inability to keep track of applications in the portal was the most universally agreed-upon deal breaker among faculty.
-
When asked to engage in a think-aloud procedure while evaluating student applications, most faculty used the information displayed on the overview page to make their decisions, even after viewing the full applications.
Many later admitted some of those factors were not ones they typically would have considered.
-
The first thing nearly all faculty did when asked to review applications was filter by their own name so they would only be shown applications from students who indicated an interest in working with them.
Many users also requested additional and/or different filtering capabilities to better meet their needs.
-
Navigating the portal was difficult for many, often because they were not expecting to find the overview page with all student applications under the label “Student Profiles.”
This issue was identified in the heuristic evaluation and validated during testing.