Although ethnographic studies and contextual inquiries can provide excellent, firsthand insights into a person’s experiences with a product or within an environment, eye tracking technology unlocks an extra layer of data that is otherwise unavailable. What is a participant noticing one second to the next? What are they overlooking? What is distracting them from their task?
Eye tracking research helped answer those questions for an experience that is otherwise challenging to observe directly: a patient using an app to navigate a large hospital campus. The busy hospital setting also presented challenges of its own, and working through them prepared us for future sessions in similar environments.
Improving Appointment Arrival
The Mayo Clinic campus in Rochester, Minnesota, spans many blocks and multiple large buildings, some nearly 20 floors tall. For an out-of-town patient—which many who come to the Rochester campus are—the prospect of navigating this sizable campus for a personal appointment is daunting.
Where should I enter? Will there be an informational desk there? Will they have paper maps for me? Will there be a physical directory? What will the signs look like? How far will I need to walk? Do these elevators even go to the floor I need?
Questions like these concern wayfinding: how people orient themselves in an environment and navigate to a destination. Mayo Clinic already has many architectural, informational, and people-driven details to help patients find their way through its buildings, but our research focused on app-based navigation, a new digital solution launched in 2022.
Recently, the Mayo Clinic patient app was updated with a feature that provides step-by-step navigation through its buildings to direct patients to their appointment’s check-in desk. In addition to building out digital maps and software within the app to support this functionality, hundreds of beacon devices were installed in all patient-accessible areas on every floor of every major building. Thus, when a patient taps a button for directions to their appointment’s check-in desk, the app detects where the patient and their phone are in relation to the beacons, and then it builds a route with step-by-step directions that takes a patient from, for instance, a ground-level parking ramp/garage elevator to a specific check-in desk on the 17th floor of the Mayo Building.
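As a rough, hypothetical sketch of how beacon-based positioning can work (Mayo Clinic's actual implementation is not public, and the beacon IDs, coordinates, and path-loss model below are all illustrative assumptions), a phone can estimate its position as a weighted centroid of the beacons it can hear:

```python
# Hypothetical beacon registry: (x, y, floor) coordinates surveyed at install time.
BEACONS = {
    "beacon-101": (0.0, 0.0, 1),
    "beacon-102": (25.0, 0.0, 1),
    "beacon-103": (25.0, 30.0, 1),
}

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Rough log-distance estimate (in meters) from a beacon's signal strength."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def estimate_position(readings):
    """Estimate (x, y, floor) as a centroid of heard beacons, weighted by 1/distance."""
    total_weight, x_sum, y_sum, floor = 0.0, 0.0, 0.0, None
    for beacon_id, rssi in readings.items():
        x, y, f = BEACONS[beacon_id]
        distance = max(rssi_to_distance(rssi), 0.1)  # clamp to avoid divide-by-zero
        weight = 1.0 / distance
        total_weight += weight
        x_sum += x * weight
        y_sum += y * weight
        floor = f  # assume all readings in one scan come from a single floor
    return (x_sum / total_weight, y_sum / total_weight, floor)

# Two beacons heard at equal strength: the estimate lands midway between them.
print(estimate_position({"beacon-101": -59, "beacon-102": -59}))  # → (12.5, 0.0, 1)
```

With a position estimate in hand, the app can match it against the digital maps to build turn-by-turn directions toward the check-in desk.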
Where Eye Tracking Comes In
We could have asked study participants to grab a test phone, tap the “walking directions” button, and try out this new functionality as we followed behind taking notes along the way. Yet using a smartphone, let alone orienting and directing yourself through a bustling multi-floor building, has a degree of nuance and intimacy that is hard to observe from even five to eight feet away.
That is why we chose to acquire Tobii’s eye tracking glasses to provide detailed, first-person usage data of the app while being surrounded by physical and audio stimulus of all kinds. However, this was our first research project to use the glasses, so we had to learn and adapt along the way to certain challenges, which are covered in this article.
A common image that comes to mind for eye tracking is a camera on a monitor, facing toward you, watching your eyes as they scan items on the screen. Tobii provides multiple options for screen-based eye trackers, but their eye tracking glasses, in comparison, utilize multiple inward- and outward-facing cameras to show not only a participant’s field of view but also where, precisely, their pupils are looking within the environment around them. This includes to-the-second measurements of the items someone’s gaze fixates on for a moment versus those that are never glanced at.
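To make the fixation idea concrete: Tobii's software computes fixations for you, but the underlying concept can be sketched as a simple dispersion-threshold (I-DT style) pass over timestamped gaze points. The thresholds and coordinates below are illustrative assumptions, not Tobii's actual algorithm or parameters:

```python
def detect_fixations(samples, max_dispersion=0.05, min_duration=0.1):
    """Group gaze samples (t, x, y) into fixations: while the gaze points stay
    tightly clustered (low dispersion), they belong to one fixation; once the
    gaze jumps, the cluster is closed and kept if it lasted long enough.
    Coordinates are assumed normalized to the scene camera frame."""
    fixations = []
    window = []
    for sample in samples:
        window.append(sample)
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            finished = window[:-1]  # the jump ends the current fixation
            if finished and finished[-1][0] - finished[0][0] >= min_duration:
                fixations.append((finished[0][0], finished[-1][0]))
            window = [sample]
    if window and window[-1][0] - window[0][0] >= min_duration:
        fixations.append((window[0][0], window[-1][0]))
    return fixations

# A steady gaze for 0.15 s, then a jump: one fixation from t=0.0 to t=0.15.
samples = [(0.00, 0.5, 0.5), (0.05, 0.5, 0.5), (0.10, 0.5, 0.5),
           (0.15, 0.5, 0.5), (0.20, 0.9, 0.9)]
print(detect_fixations(samples))  # → [(0.0, 0.15)]
```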
For our study, eye tracking let us know what parts of the app’s navigational interface were being looked at, such as the map, route line, or text directions, as well as what parts of the hospital environment aided or distracted from the task at hand, such as signage, physical landmarks, or other people.
Recruiting the Right Participants
Whether for a small, local clinic or a bustling multi-building hospital campus, it was essential to ensure we had the right contacts to recruit patients and give staff a heads-up about our upcoming presence. Ideally, we wanted to secure a set of patients who had appointments on the days we would be on-site. Furthermore, we wanted the employees and managers of the areas in which we would conduct our sessions to have advance notice of the project and provide permission to work within their space.
To recruit participants, we could have hung out in a waiting room or cafeteria to intercept people and ask them to participate, though some might have been short on time, late for an appointment, or simply not in an emotionally comfortable place to engage with a researcher. In our case, we needed participants to arrive well ahead of their appointment so they could meet with us, grant permission to track their gaze (opt-in consent), put on the eye tracking glasses, and still have plenty of time to use the app to navigate to their appointment’s check-in desk. They needed prior notice of what they would be doing, the incentives that would be provided, and where to meet us, so advance recruitment and scheduling were essential.
Glasses-on-Glasses, Plus Walking!
Two other wrinkles had to be considered for the research: prescription glasses and the requirement to walk around a hospital. As it turns out, although the eye tracking glasses we used were large enough to fit over a typical pair of glasses, the glasses controller application software cannot calibrate accurately with obstructions like a person’s own Rx glasses—the focal point shown in the software will jitter and bounce around, providing unreliable data. Therefore, we had to screen out anyone who had to wear prescription glasses to see. Wearing prescription contacts that weren’t colored was fine. Another solution would have been to utilize Tobii’s snap-on prescription lens kit, which unfortunately we did not have for this study.
The other atypical notice we had to provide prospective participants is that they would need to feel comfortable moving on their own around a hospital to complete the study. As it turns out, we had an additional eye tracking study we conducted during the same days that only required sitting at a table and reviewing a prototype, so those uneasy with the mobility request had an alternative.
Designing a Reliable and Consistent Exercise
As with any research project, diligent construction of a plan is paramount. We conducted both a traditional, seated research exercise in which a participant reviewed a prototype while wearing the eye tracking glasses as well as the exercise that required walking around the hospital campus while using an app. The former only required securing a small conference room with good lighting; the latter required designating a meeting place and building a variety of navigation routes starting from that location.
Since we couldn’t predict how much time a participant would need to complete a route, either due to walking speed or how busy the hospital was, we planned multiple routes to add on a la carte, depending on how much time we had left. Each route had one or two extra routes that could start from where the previous route ended, allowing plenty of flexibility. Although this meant that some participants might have experienced a little more of the navigation experience than others, all participants completed a minimum, core experience, and the additional routes traveled provided a bonus of data to confirm trends and identify any outlying issues.
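The a la carte approach can be pictured as a small scheduling helper: given chained routes (each add-on starting where the previous one ended) and the time left in a session, pick the core route plus whatever add-ons still fit. The route names and durations below are hypothetical:

```python
# Hypothetical chained route plan: each add-on begins where the previous ends.
ROUTES = [
    {"name": "Lobby to Gonda check-in", "minutes": 12},          # core route
    {"name": "Gonda check-in to Mayo elevators", "minutes": 8},  # add-on 1
    {"name": "Mayo elevators to 17th-floor desk", "minutes": 10} # add-on 2
]

def plan_session(routes, time_budget_minutes):
    """Select the core route plus as many chained add-ons as time allows."""
    chosen, remaining = [], time_budget_minutes
    for route in routes:
        if route["minutes"] > remaining:
            break  # routes must run in order, so stop at the first that won't fit
        chosen.append(route["name"])
        remaining -= route["minutes"]
    return chosen

# With 25 minutes left, the core route and one add-on fit; the second does not.
print(plan_session(ROUTES, 25))
```

Because every participant at least completes the core route, the data stays comparable across sessions while faster walkers contribute bonus routes.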
Inspecting the Lighting and Acoustics
Something that stood out more than we expected was how much lighting and environmental sound could impact the session recordings. For the seated, single-room study, we had to carefully position the participant and devices so that the screens remained within the camera’s frame, no overhead lighting blocked the device’s screens with reflections, and ambient lighting didn’t result in a washed-out or overexposed device screen.
The same concerns were important for the walking exercise, too, though they were more challenging to address as participants moved from rooms with bright skylights to dimly lit hallways and fluorescent-lit waiting rooms. Participants also had to be coached and monitored to hold the test phone high enough so that the glasses’ camera consistently recorded the phone’s screen and where they were looking on that screen.
Acoustically, we learned that although the glasses’ microphone was good at recording the participant’s voice, the moderator and notetakers ended up difficult to hear, especially since all of us were wearing face masks per the hospital’s requirements. If hearing the moderator in the recording is important, the moderator needs to speak loudly enough for the glasses’ microphone to pick them up.
For both lighting and sound, piloting the study several times was, as expected, essential for identifying these potential issues in advance and preparing for them.
Preparing Backup Plans
When working with technology, issues can come up mid-session, so it is always good to have a backup plan. During our sessions, we faced a few unexpected issues, including batteries running low (they typically record for about 105 minutes), software bugs within the new navigation software, and certain navigation routes failing to build within the app. In each case, having extras on hand helped. We had spare batteries charged and ready to swap into the glasses’ battery pack, a stakeholder nearby to troubleshoot the navigation software, an extra test phone to switch to if one failed, and alternate navigation paths should one not build. This ensured every participant completed at least one successful walking route.
Masking and Sanitizing
Hospitals remain one of the most sensitive environments in which to conduct research with respect to COVID-19. Working in such spaces mandates face masks as well as equipment sanitization for every session.
Participants were informed before arrival that masks would be required for the full session. If they arrived without one, we would provide one to them. The moderator can also wear a colorful or unique face mask to make themselves easier to spot when participants arrive and are looking for them.
As many who wear glasses or sunglasses know, face masks can fog the lenses, and the eye tracking glasses were no different. Fortunately, ours came with multiple nose clips in different sizes, and ensuring the participant’s face mask had a nose wire further reduced the chance of the lenses fogging up.
We also needed to be mindful of sanitizing the many pieces of hardware participants were expected to hold, interact with, and wear. We therefore acquired a large canister of technology-safe sanitizing wipes that we used to fully sanitize everything participants would touch—the glasses, the glasses’ battery pack, and any test phones or tablets they used during their session. This sanitizing can be done after each session, although some participants may appreciate seeing each item sanitized during their session before they handle the hardware.
Power Supplies, Batteries, and Memory Storage
It was essential to make sure we always had charged batteries for the eye tracking recording unit and kept the tablet charged to run the eye tracking software.
Beyond power, we also needed to keep an eye on the memory card to which all eye tracking sessions were saved. This card sits within the glasses’ battery pack, and the eye tracking management software shows in real time how full it is, as does an indicator light on the battery pack itself. Our sessions were brief enough that we were fortunate to fit all of them on the same card, but for lengthier sessions, the data could be exported from the memory card to a computer to free up space between sessions.
To Follow or Not to Follow
As mentioned earlier, a tablet or smartphone is necessary to run the eye tracking management software. This device connects to the glasses via a local Wi-Fi signal, meaning that, when connected, the tablet or smartphone will not be connected to the internet for any mid-session communication or live streaming to stakeholders. Furthermore, if the participant and the glasses get too far from the device running the eye tracking management software, the connection will be lost, including the real-time feed of what the glasses are recording and what the wearer is looking at.
This raises the question of what to do when someone is walking around for a study: Follow them closely or let them wander off? The glasses’ recording will continue, regardless of the initial connection being maintained, so why follow them to begin with?
Following the participant while they walked allowed us to watch a live stream of their point of view, address any technical issues that came up, remind them to hold the phone higher for the glasses’ camera (in our case), and help if they got confused or the software they were testing failed. Not following a participant might have given a purer recording of how they would act on their own without someone nearby monitoring them, but a designated meeting place and time would still have been needed to retrieve the glasses and stop the recording.
Whether participants should think out loud while wearing the glasses is subjective. For our study, we encouraged participants to do so, and that resulted in clearer qualitative insights on what participants were feeling about the experience versus us inferring things by watching what they did in an unnarrated video. However, as with not following the participant, not asking them to talk to themselves might have resulted in a purer recording of how they would have acted during the requested task.
For our researchers’ convenience, we uploaded recordings to a shared space soon after the sessions were completed so they could watch a session later if they couldn’t watch it live. Before uploading to a shared space, recordings needed to be exported from the memory card to a computer and saved as a .mp4 within the analysis software. The software requires robust computing—exporting a single video to .mp4 took from five to ten minutes in our experience—so it was important to have time set aside.
In addition to supporting rewatching the videos and conducting traditional, manual analysis, Tobii’s eye tracking analysis software Tobii Pro Lab can count how often and how long participants look at specific items during their session, such as a screen within the app or a major sign in the hospital. To have the software track this, we uploaded a separate, high-resolution image of the item in question and specified a segment of the video in which the software should identify that image. This functionality takes significant computer resources and time.
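Pro Lab handles this image matching and tallying itself; conceptually, though, once gaze points are mapped onto a reference image, the metric reduces to counting samples that land inside an area of interest. This sketch assumes pixel coordinates and a 50 Hz sampling rate, both illustrative:

```python
def aoi_metrics(gaze_samples, aoi_box, sample_interval=1 / 50):
    """Tally hit count and total dwell time for one area of interest (AOI).
    gaze_samples: (x, y) points already mapped onto the reference image;
    aoi_box: (left, top, right, bottom) bounds of the item of interest;
    sample_interval: seconds per sample (assumes a 50 Hz tracker here)."""
    left, top, right, bottom = aoi_box
    hits = sum(1 for x, y in gaze_samples
               if left <= x <= right and top <= y <= bottom)
    return {"hits": hits, "dwell_seconds": hits * sample_interval}

# Two of three gaze points land on a 10x10 sign region in the image.
print(aoi_metrics([(5, 5), (6, 6), (50, 50)], (0, 0, 10, 10)))
```

Real tools refine this with fixation grouping and perspective correction, but dwell time per AOI is the core number behind "how often and how long."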
What We Discovered
Our eye tracking research on the new patient navigation experience provided rich details. We learned that most people preferred to focus on the route drawn on the map instead of the step-by-step text directions. Many had little idea how to interpret distances, such as, “Turn left in 89 feet.” Hospital signage and physical landmarks were a frequent way to confirm patients were on the right track. We were also happy to hear that, compared to how they would typically find their check-in desk, every participant said they would prefer to use the app’s navigation instead.
Despite the challenges we ran into, this eye tracking research was ultimately successful, helpful, and encouraging. We confirmed our investment in the hardware was worthwhile, and we look forward to using it again for many more projects, now with more know-how on running these studies in hospitals.
Retrieved from https://uxpamagazine.org/directing-to-care-technology-to-help-patients-find-their-way/