A recent study from the University of Alberta’s Actions in Complex Environments Laboratory suggests that the gestures we use on our smartphones can reveal how we make decisions. That understanding could feed into applications as varied as tracking patients’ recovery from injuries, helping recruiters assess candidates, and improving the design of mobile apps.
According to the paper, “As decisions necessitate actions to create an impact on the world, measurements derived from physical movements—such as using a mouse to control a cursor—serve as dynamic indicators of decision-making.” The authors further argue that touchscreens yield more nuanced insights into indecision than traditional computers.
Craig Chapman, an associate professor involved in the study, remarked, “By meticulously observing external behavior, we can glean substantial insights into what is happening within a person’s mind.” In the experiments, participants used Android smartphones and tablets to complete timed “reach-decision” tasks, tapping and swiping to choose among options that appeared at different locations on the screen.

According to Chapman, touch devices may reveal these movement patterns even more clearly than a mouse, since people interact with them in a more natural way. The results bear this out: harder decisions produced slower responses, longer movement durations, and more curved finger trajectories than easier choices did.
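To make the reported measures concrete, here is a minimal, hypothetical sketch (not the study’s actual code) of how movement duration and trajectory curvature might be computed from raw touch samples. It assumes each sample is a `(time, x, y)` tuple and treats curvature as the ratio of the traced path length to the straight-line distance between the first and last points, so a perfectly straight swipe scores 1.0 and a hesitant, curved one scores higher.

```python
from math import hypot


def movement_metrics(samples):
    """Return (duration, curvature) for a list of (t, x, y) touch samples.

    Curvature is path length divided by straight-line distance:
    1.0 means a perfectly straight movement; larger values mean
    the finger wandered more on its way to the target.
    """
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    duration = t1 - t0
    # Sum the distances between consecutive samples to get the traced path.
    path = sum(
        hypot(bx - ax, by - ay)
        for (_, ax, ay), (_, bx, by) in zip(samples, samples[1:])
    )
    straight = hypot(x1 - x0, y1 - y0)
    curvature = path / straight if straight else float("inf")
    return duration, curvature


# A direct swipe versus one that detours, as an indecisive reach might:
straight_swipe = [(0.00, 0, 0), (0.10, 50, 0), (0.20, 100, 0)]
curved_swipe = [(0.00, 0, 0), (0.15, 40, 60), (0.30, 100, 0)]
```

Calling `movement_metrics` on these examples shows the curved swipe taking longer and scoring a higher curvature, which is the pattern the study associates with harder decisions.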
Chapman sees potential for the approach to change how people are assessed in specific settings. Health professionals and trainers, for instance, could use movement analytics to track rehabilitation progress and flag where additional training or support is needed. The study also points to hiring evaluations, where how candidates move through moments of indecision could inform judgments about their fit for particular roles.
The study concludes that such data could make gathering decision-making information more efficient, by identifying which combinations of movement measures are most sensitive for a given task. App developers might use the same insights to better position buttons tied to crucial actions, such as purchases, with the aim of reducing indecision and boosting conversion rates.
The complete research paper is available through the university’s publication and details how this work extends earlier studies that examined only mouse-and-keyboard interactions.