A window to the soul? Maybe, but the eyes are also a flashing neon sign for a new artificial intelligence-based system that can read them to predict what you’ll do next.
A University of Maryland researcher and two colleagues used eye-tracking technology and a new deep-learning AI algorithm to predict study participants’ choices as they viewed a comparison website with rows and columns of products and their characteristics.
The algorithm, known as RETINA (Raw Eye Tracking and Image Ncoder Architecture), could precisely zero in on selections before people had even made their decision.
“This is an area where AI technology is very effective: using data to make predictions,” said Michel Wedel, Professor Emeritus and PepsiCo Chair in Consumer Science at the Robert H. Smith School of Business. He worked with Moshe Unger of Tel Aviv University and Alexander Tuzhilin of New York University to develop RETINA. Their research is published in the journal Data Mining and Knowledge Discovery.
Researchers who use eye movement data typically condense it into aggregated summaries, which can discard information about certain types of eye movements. With their machine learning method, Wedel and his colleagues could use the full stream of raw eye-tracking data rather than the aggregated summaries that current methods rely on.
Unusually, the algorithm is able to incorporate raw eye movement data from each eye, Wedel said.
“That’s a lot of data – several hundred thousand data points, with millions of parameters – and we use them separately for each eye,” he said.
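The general idea of modeling raw gaze samples per eye, rather than pre-aggregated fixations, can be illustrated with a toy sketch. This is not the published RETINA architecture; every function name, feature, and weight below is hypothetical, and a real system would learn its parameters from data with a deep network.

```python
# Illustrative sketch only -- NOT the actual RETINA model.
# It shows the idea of keeping each eye's raw sample stream separate
# and encoding it before combining the two for a prediction.
import random

def simulate_gaze_stream(n_samples, seed):
    """Fake raw eye-tracker output: one (x, y) screen coordinate per sample."""
    rng = random.Random(seed)
    return [(rng.uniform(0, 1920), rng.uniform(0, 1080)) for _ in range(n_samples)]

def encode_eye(stream):
    """Toy per-eye encoder: mean gaze position plus dispersion of raw samples."""
    n = len(stream)
    mean_x = sum(x for x, _ in stream) / n
    mean_y = sum(y for _, y in stream) / n
    dispersion = sum(abs(x - mean_x) + abs(y - mean_y) for x, y in stream) / n
    return [mean_x, mean_y, dispersion]

def predict_choice_score(left_stream, right_stream, weights):
    """Encode the two eyes separately, then combine into one choice score.
    A real model would learn `weights`; here they are fixed for illustration."""
    features = encode_eye(left_stream) + encode_eye(right_stream)
    return sum(w * f for w, f in zip(weights, features))

left = simulate_gaze_stream(1000, seed=1)    # raw samples, left eye
right = simulate_gaze_stream(1000, seed=2)   # raw samples, right eye
weights = [0.001, 0.001, -0.002] * 2         # hypothetical "learned" weights
print(predict_choice_score(left, right, weights))
```

In a real pipeline, a score like this would be compared across the products on screen as the gaze data streams in, so a likely choice can be flagged before the person clicks.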
The algorithm could be applied in many contexts by all types of companies. For example, a retailer like Walmart could use it to enhance the virtual shopping experiences it develops in the metaverse, a shared virtual online world. Many VR devices that people will use to explore the metaverse will have built-in eye tracking to help better render the virtual environment. Using this algorithm, Walmart could tailor the range of products displayed in its virtual store to what a person is likely to choose, based on their initial eye movements.
“Even before people have made a choice, based on their eye movements, we can tell that it is very likely that they will choose a certain product,” said Wedel. “With this knowledge, marketers could reinforce that choice or try to promote another product instead.”
RETINA has applications outside of marketing, as eye tracking becomes increasingly common in many other fields, including medicine, psychology and psychiatry, usability and design, the arts, reading, finance and accounting – anything in which people make decisions based on some sort of visual assessment.
The tech’s biggest players, including Meta and Google, have recently acquired eye-tracking companies and are considering a range of applications. Thanks to front-facing cameras, it is now possible to track people’s eye movements from any smartphone, tablet or personal computer. Such consumer device-based approaches aren’t yet as accurate as the advanced eye-tracking hardware that researchers currently use, Wedel said, and there’s still the big problem of privacy concerns: companies need to request permission from users.
Researchers are already working to commercialize the algorithm and expand their research to optimize decision-making.
“We think eye tracking will become available on a very large scale,” Wedel said. “Processing eye movement data is usually very laborious. With this algorithm we avoid a lot of that, so there may be many applications that we haven’t even thought of.”
More information:
Moshe Unger et al, Predicting Consumer Choice from Raw Eye Movement Data Using the RETINA Deep Learning Architecture, Data Mining and Knowledge Discovery (2023). DOI: 10.1007/s10618-023-00989-7
Provided by University of Maryland
Citation: Researchers develop algorithm that analyzes screen users’ eye movement data (February 1, 2024), retrieved February 1, 2024