Keeping Track of Your Students
Overview & Goals
Our client RyeCatcher Education supports teachers in providing the environment all students need to succeed and reach their full potential, including students from at-risk backgrounds and students with special needs. RyeCatcher develops software that connects students, teachers, and parents, and it needed a tool for efficiently monitoring and tracking discrete student behaviors over time.
The project goal was to develop a native tablet application that makes it easy for teachers to track student and class behaviors and helps them make sense of the resulting data.
Research & Use Case Scenarios
Weighing client expectations against the use case scenarios provided by RyeCatcher allowed us to explore the design space and understand the scope of the project. Through stakeholder interviews with RyeCatcher CEO Arthi Krishnaswami, we learned the capabilities of the current tool, how teachers and aides leverage it, and which common user complaints pointed to weak points in the digital application. Additionally, the use case scenarios gave us a complete inventory of tasks a user should be able to accomplish within the app, providing a landscape-level snapshot of all the digital interactions the app must support.
The next step was to map out the information architecture of the app: identifying which screens are necessary, which intermediary screens aid ease of use, and determining the paths a user would intuitively take to reach a given goal. The core of the app, as outlined by RyeCatcher, needed to be the behavior tracking screen. Two access paradigms were considered: starting with a student and viewing all of their tracked behaviors, or starting with a behavior and viewing all students currently tracking it. In the spirit of human-centered design, we chose the student-first path, as it is the more intuitive of the two.
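The two access paradigms can be sketched as a minimal data model. This is purely illustrative: the class names, fields, and helper functions below are our own assumptions, not RyeCatcher's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class BehaviorTracker:
    behavior: str  # e.g. "staying on task" (hypothetical example)
    observations: list = field(default_factory=list)  # timestamped entries

@dataclass
class Student:
    name: str
    trackers: list = field(default_factory=list)  # BehaviorTracker instances

    def tracked_behaviors(self):
        """Student-first path: start at a student, list their tracked behaviors."""
        return [t.behavior for t in self.trackers]

def students_tracking(behavior, roster):
    """Behavior-first path (the paradigm we did not choose), shown for contrast:
    start from a behavior and find every student currently tracking it."""
    return [s.name for s in roster if behavior in s.tracked_behaviors()]

# Usage: the student-first path answers "what is Alex working on?" directly.
alex = Student("Alex", [BehaviorTracker("staying on task")])
jo = Student("Jo", [BehaviorTracker("staying on task"), BehaviorTracker("raising hand")])
roster = [alex, jo]
print(alex.tracked_behaviors())                      # ['staying on task']
print(students_tracking("staying on task", roster))  # ['Alex', 'Jo']
```

The student-first path maps onto a single record lookup, which is one reason it reads as the more intuitive navigation model for a teacher thinking about an individual student.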
Treating the challenge as open-ended as possible, our group pitched a variety of ideas, ranging from a detailed digital process for noting a student incident, to using AI and language processing to speed up documentation, to data visualizations. After brainstorming around methods of tracking student behavior, our team split the app into three major components:
1. Personal Student View
2. Class-wide View
3. Behavior Tracker View
With three main chunks to anchor our designs around, our team began mocking up drafts of both the visual and interaction design details for these screens.
Three user testing sessions were conducted with individuals who have direct classroom experience, ranging from a substitute elementary school teacher to a high school teacher with over ten years of experience. The app's visual design, in particular the tracker screen, was the main topic of discussion: our team's approach to the behavior tracker revolved around data visualization, using color, scale, and iconography to make trackers more efficient and data more easily interpreted. This opened up interesting conversations about transparency of information versus privacy, and about how visual symbols can be construed differently by different people. Specific insights included:
- The visual size of a tracker could indicate a behavior a student is personally working on.
- How prominently should negative trackers be featured without risking demonizing certain students?
- Icons act as visual mnemonic devices and make trackers easier to tell apart, even when users still read the labels.
Each interview was split into two parts:
1. Free-form exploration, in which the tester peruses the app on their own while thinking aloud about first impressions. Functionalities they quickly found and appreciated, or did not understand, surfaced during this section.
2. Guided exploration, in which the tester is given a series of tasks that encourage them to interact with all three major screens and attempt every major functionality. This makes it very clear when functionalities are intuitive and obvious versus difficult to find, and insights into why certain digital interactions failed were extremely useful.
Across the sessions, three broader takeaways emerged:
- Users find meaning in every detail: color, scale, pattern, and the spatial orientation of digital buttons.
- Discretion versus efficiency: there is a very real trade-off between making things convenient and easy to use and the privacy users have a right to expect from digital products.
- Slight learning curves are a fact of life: our three user testers possessed varying degrees of technological savvy, so some learned the workings of the app faster than others. Designing interactions should not be a quest to remove all failure, but to offer pleasant digital experiences with strong redundancies and simple recovery paths that let all users quickly bounce back after a wrong tap or mis-click.