HCIL Team Combines Smell with Sight to Better Understand Data

Nov 29, 2018

Imagine running through a dark forest in a virtual reality video game, and being able to smell the crisp scent of pine needles all around you.

Or what if you were analyzing a complex data set, and could associate specific scents with data points in order to better track and recall the information?

These ideas are now becoming a reality through innovative research by graduate students in the University of Maryland’s Human-Computer Interaction Lab (HCIL).

Biswaksen Patnaik, a second-year master’s student in human-computer interaction, and Andrea Batch, a third-year doctoral student in information studies, are exploring ways to convey information with scent as a complement to the visual representation of data sets.

The students recently presented their paper on “information olfactation”—a term they derived by combining olfaction, which is the sense of smell, with information visualization—at a science and information visualization conference in Berlin.

“This was easily our most crazy idea to date,” says the students’ adviser and paper co-author Niklas Elmqvist, a professor of information studies in the iSchool.

Elmqvist, who also has an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS), is the current director of HCIL.

“Information Olfactation: Harnessing Scent to Convey Data” lays out theories on the different ways people may perceive scent, including intensity, direction, and combined fragrances. The theories are based on benchmarks traditionally used in data visualization and previous literature in perceptual psychology, the researchers say.

The paper describes two prototypes the team has built—one for a desktop computer and one for a virtual reality headset—that disperse essential oils through diffusers. They also designed three types of graph layouts in which scent complements the visual presentation of data.
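To make the idea of an olfactory encoding concrete, here is a minimal sketch, assuming a hypothetical diffuser controlled by a duty cycle. It maps a normalized data value to scent intensity much as a visual encoding maps a value to color saturation. The function name and hardware interface are invented for illustration; they are not the team’s actual software.

```python
# Hypothetical olfactory encoding: map a data value to diffuser output,
# analogous to mapping a value to color saturation in a visualization.

def encode_intensity(value, vmin, vmax, max_duty_cycle=1.0):
    """Linearly map a data value to a diffuser duty cycle in [0, 1]."""
    if vmax == vmin:
        return 0.0
    normalized = (value - vmin) / (vmax - vmin)
    return max(0.0, min(1.0, normalized)) * max_duty_cycle

# Example: a data value of 7 on a 0-10 scale.
duty = encode_intensity(7, 0, 10)
print(f"Run the pine diffuser at {duty:.0%} duty cycle")  # prints 70%
```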

“You need a playground so that you can play around with producing different kinds of stimuli, which is why we made this olfactory display machine,” says Patnaik, who has a background in industrial and mechanical design, as well as digital taste and smell technologies.

While Patnaik built the prototypes, Batch, whose research focuses on immersive environments in virtual and augmented reality, developed the software.

So far, the researchers have been experimenting on themselves with ways to convey data via scent using the prototypes. They plan to recruit students for a user study to test how effective their proposed olfactory encodings are at conveying data through various scents.

“Compared to large-scale statistical studies of survey responses, we’re going to be collecting fairly rich data in a carefully designed study,” says Batch.

One of the network visualizations designed by Patnaik and Batch is built from a Bitcoin data set. Each colored node represents the average transaction rating of a Bitcoin holder and is also associated with a specific smell. The nodes are linked to represent transactions between users.
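As a rough illustration of that node-to-scent mapping, here is a minimal Python sketch. The scent palette, rating range, and sample data are invented for this example; the team’s actual data set and encodings are described in their paper.

```python
# Hypothetical sketch of mapping a node's average transaction rating
# to a scent. The palette, rating range, and sample data are assumed.

scent_palette = ["citrus", "lavender", "pine", "mint"]  # assumed scents

def scent_for_rating(avg_rating, lo=-10, hi=10):
    """Bin an average transaction rating into one scent from the palette."""
    span = (hi - lo) / len(scent_palette)
    index = min(int((avg_rating - lo) / span), len(scent_palette) - 1)
    return scent_palette[index]

# Nodes map users to average transaction ratings; edges are transactions.
nodes = {"alice": 8.2, "bob": -3.5, "carol": 1.0}
edges = [("alice", "bob"), ("bob", "carol")]

for user, rating in nodes.items():
    print(user, "->", scent_for_rating(rating))  # e.g., alice -> mint
```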

Elmqvist says that similar network visualizations could be built using social networking data, such as Facebook friends.

Yet he stresses that scent is complementary to vision, strengthening the connection in the observer’s mind.

“Smell is powerful, but not nearly as powerful as vision; that’s why we always combine it,” he says.

There is already scientific literature showing that a fragrance associated with an emotionally significant event can aid information recall, says Batch. She theorizes that a molecular bouquet—a certain set of combined scents—may also improve a user’s ability to recall information.

“If I asked you, ‘What does your favorite café smell like?’ It’s not a specific fragrance. It’s not citrus, not lavender, but a combination of smells, and that’s how you identify it,” says Patnaik, explaining the molecular bouquet concept. “This is my café; this is how it smells.”

A virtual reality study recently completed by researchers at the University of Maryland found that users recall information better in immersive environments. Patnaik and Batch are taking information recall in virtual reality one step further by testing their molecular bouquet theory.

They also plan to test users’ ability to detect the direction of a scent, which could yield significant findings, given the large role that directionality plays in an immersive environment.

While studying information olfactation, all three researchers were surprised by how often the power of the human sense of smell is overlooked.

“People don’t think about it; it’s almost funny,” says Elmqvist. “But there’s a reason we can smell: for evolutionary reasons, people have used it to detect danger, like telling good food from dangerous food. My students really opened my eyes to this.”

Story by Maria Herd