New augmented reality system lets smartphone users get hands-on with virtual objects
Brown University
A new software system developed by Brown University researchers turns cell phones into augmented reality portals, enabling users to place virtual building blocks, furniture and other objects into real-world backdrops, and use their hands to manipulate those objects as if they were really there.
To watch this video on YouTube: https://www.youtube.com/watch?v=ZYMjhKMpNXk
The developers hope the new system,
called Portal-ble, could be a tool for artists, designers, game developers and
others to experiment with augmented reality (AR).
The team will present the work later this month at the ACM Symposium on User Interface Software and Technology (UIST 2019) in New Orleans. The source code for Android is freely available for download on the researchers’ website, and iPhone code will follow soon.
“AR is going to be a great new mode
of interaction,” said Jeff Huang, an assistant professor of computer science at
Brown who developed the system with his students.
“We wanted to make something that made AR portable so that people could use it anywhere without any bulky headsets. We also wanted people to be able to interact with the virtual world in a natural way using their hands.”
Huang said the idea for Portal-ble’s
“hands-on” interaction grew out of some frustration with AR apps like Pokemon
GO. AR apps use smartphones to place virtual objects (like Pokemon characters)
into real-world scenes, but interacting with those objects requires users to
swipe on the screen.
“Swiping just wasn’t a satisfying way of interacting,” Huang said. “In the real world, we interact with objects with our hands. We turn doorknobs, pick things up and throw things. So we thought manipulating virtual objects by hand would be much more powerful than swiping. That’s what’s different about Portal-ble.”
The platform makes use of a small
infrared sensor mounted on the back of a phone. The sensor tracks the position
of people’s hands in relation to virtual objects, enabling users to pick
objects up, turn them, stack them or drop them.
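To illustrate the idea, here is a minimal sketch (not Portal-ble’s actual code) of how a hand-tracked grab might be detected: the sensor reports fingertip positions, a pinch is recognized when thumb and index finger come together, and the grab registers if the pinch point is close enough to a virtual object. The coordinate frame, thresholds and object representation are assumptions made for this example.

    # Illustrative sketch only -- not Portal-ble's actual code. It assumes the
    # infrared sensor reports 3D fingertip positions in the same coordinate frame
    # as the virtual objects; the threshold values are placeholder assumptions.
    import math

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def detect_grab(thumb_tip, index_tip, objects,
                    pinch_threshold=0.03, reach_threshold=0.10):
        """Return the virtual object being grabbed, or None.

        thumb_tip, index_tip: (x, y, z) fingertip positions from the tracker, in meters.
        objects: list of dicts, each with a 'center' position for a virtual object.
        """
        if not objects or dist(thumb_tip, index_tip) > pinch_threshold:
            return None  # no objects, or fingers too far apart to count as a pinch
        pinch_point = tuple((t + i) / 2 for t, i in zip(thumb_tip, index_tip))
        # Grab the closest object within reach of the pinch point, if any.
        gap, nearest = min(((dist(pinch_point, o["center"]), o) for o in objects),
                           key=lambda pair: pair[0])
        return nearest if gap <= reach_threshold else None

    # Example: a pinch about 3 cm from a virtual block counts as grabbing it.
    block = {"name": "block", "center": (0.0, 0.0, 0.25)}
    print(detect_grab((0.01, 0.0, 0.22), (-0.01, 0.0, 0.22), [block]))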
It also lets people use their hands to virtually “paint” onto real-world backdrops. As a demonstration, Huang and his students used the system to paint a virtual garden into a green space on Brown’s College Hill campus.
Huang says the main technical
contribution of the work was developing the right accommodations and feedback
tools to enable people to interact intuitively with virtual objects.
“It turns out that picking up a
virtual object is really hard if you try to apply real-world physics,” Huang
said. “People try to grab in the wrong place, or they put their fingers through
the objects. So we had to observe how people tried to interact with these
objects and then make our system able to accommodate those tendencies.”
To do that, Huang enlisted students
in a class he was teaching to come up with tasks they might want to do in the
AR world — stacking a set of blocks, for example.
The students then asked other people to try performing those tasks using Portal-ble, while recording what people were able to do and what they couldn’t. They could then adjust the system’s physics and user interface to make interactions more successful.
“It’s a little like what happens
when people draw lines in Photoshop,” Huang said. “The lines people draw are
never perfect, but the program can smooth them out and make them perfectly
straight. Those were the kinds of accommodations we were trying to make with
these virtual objects.”
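One plausible form such an accommodation could take, in the spirit of the line-smoothing analogy, is snapping an imprecise grab onto the nearest point of an object within a small tolerance. The sketch below is an illustration under that assumption, not the paper’s implementation; the snap radius and bounding-box representation are invented for the example.

    # Illustrative sketch of a "forgiving grab" accommodation, in the spirit of the
    # Photoshop analogy above. Not taken from Portal-ble; the snap radius and the
    # axis-aligned bounding-box representation are assumptions for this example.
    import math

    def snap_grab_point(grab_point, box_min, box_max, snap_radius=0.08):
        """Clamp an imprecise grab point onto (or into) an object's bounding box.

        Returns the snapped point if the hand is within snap_radius of the box,
        otherwise None (the grab does not register).
        """
        # Closest point on the box to where the user actually grabbed.
        closest = tuple(min(max(g, lo), hi)
                        for g, lo, hi in zip(grab_point, box_min, box_max))
        gap = math.sqrt(sum((g - c) ** 2 for g, c in zip(grab_point, closest)))
        return closest if gap <= snap_radius else None

    # A grab 5 cm above a 10 cm block still "lands" on its top face.
    print(snap_grab_point((0.0, 0.15, 0.0), (-0.05, 0.0, -0.05), (0.05, 0.10, 0.05)))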
The team also added sensory feedback
— visual highlights on objects and phone vibrations — to make interactions
easier. Huang said he was somewhat surprised that phone vibrations helped users
to interact.
Users feel the vibrations in the hand they’re using to hold the phone, not in the hand that’s actually grabbing for the virtual object. Still, Huang said, the vibration feedback helped users interact with objects more successfully.
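A rough sketch of how the two feedback cues could be driven by hand-to-object distance follows; the classes, method names and thresholds are placeholders for illustration rather than Portal-ble’s API.

    # Illustrative sketch of the two feedback cues described above: a visual
    # highlight on nearby objects and a phone vibration when the hand is close
    # enough to grab. The classes, method names and thresholds are placeholder
    # assumptions, not Portal-ble's API.
    class VirtualObject:
        def __init__(self, name):
            self.name = name
            self.highlighted = False

    class Phone:
        def vibrate(self, duration_ms):
            # Stand-in for a platform vibration call on the handheld phone.
            print(f"vibrate for {duration_ms} ms")

    def update_feedback(distance_to_hand, obj, phone,
                        highlight_range=0.15, vibrate_range=0.05):
        """Update feedback cues based on how far the hand is from the object (meters)."""
        obj.highlighted = distance_to_hand <= highlight_range
        if distance_to_hand <= vibrate_range:
            phone.vibrate(duration_ms=30)  # felt in the hand holding the phone

    update_feedback(0.04, VirtualObject("block"), Phone())  # triggers both cues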
In follow-up studies, users reported
that the accommodations and feedback used by the system made tasks
significantly easier, less time-consuming and more satisfying.
Huang and his students plan to
continue working with Portal-ble — expanding its object library, refining
interactions and developing new activities. They also hope to streamline the
system to make it run entirely on a phone. Currently, the system requires an
infrared sensor and an external compute stick for extra processing power.
Huang hopes people will download the
freely available source code and try it for themselves.
“We really just want to put this out
there and see what people do with it,” he said. “The code is on our website for
people to download, edit and build off of. It will be interesting to see what
people do with it.”
Co-authors on the research paper
were Jing Qian, Jiaju Ma, Xiangyu Li, Benjamin Attal, Haoming Lai, James
Tompkin and John Hughes. The work was supported by the National Science
Foundation (IIS-1552663) and by a gift from Pixar.