With student invention, not seeing is believing

Graduate student Alex Simes tests out the “SpiderSense” suit. Photo: Lance Long

The minds at the Electronic Visualization Laboratory have brought another science fiction fantasy to life with their latest invention, “SpiderSense.”

Originally named NinjaVision, the special suit allows users to sense — without seeing — obstacles around them.

“What we wanted to do is create a suit that can sense the environment and sense if someone is approaching you, and you could constantly know what’s going on around you,” said Victor Mateevitsi, doctoral student in computer science and creator of SpiderSense.

“We had never built something that big, but we had the background to do it.”

The device uses sensor modules positioned on the wearer’s body to scan the environment with ultrasound technology.

The suit delivers varying pressure feedback to the skin, giving the user a 360-degree sense of all objects within 60 feet.

As the sensors detect nearby objects, more pressure is exerted on the skin, giving the user a spatial sense of the environment without the use of vision.
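The article describes this closer-means-stronger feedback only at a high level. A minimal sketch of such a mapping, with all names, the linear falloff, and the per-direction readings invented here for illustration (this is not the actual EVL implementation), might look like:

```python
# Hypothetical sketch of a SpiderSense-style feedback mapping: each
# body-mounted ultrasonic sensor reports a distance, and the suit
# presses harder on the skin as detected objects get closer.

MAX_RANGE_FT = 60.0  # sensing range reported in the article

def pressure_level(distance_ft: float) -> float:
    """Map a sensed distance to a feedback intensity in [0.0, 1.0].

    Objects at or beyond the 60-foot range produce no pressure;
    an object touching the wearer produces maximum pressure.
    """
    if distance_ft >= MAX_RANGE_FT:
        return 0.0
    if distance_ft <= 0.0:
        return 1.0
    # Linear falloff for simplicity; a real suit might use a
    # nonlinear curve tuned to human touch perception.
    return 1.0 - distance_ft / MAX_RANGE_FT

# One reading per sensor gives the wearer a 360-degree picture.
readings = {"front": 10.0, "back": 55.0, "left": 60.0, "right": 30.0}
feedback = {direction: pressure_level(d) for direction, d in readings.items()}
```

Here the nearby object in front produces strong pressure, while the object at the edge of range on the left produces none, giving the blindfolded wearer a rough spatial map through touch alone.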

SpiderSense began as a class project in human augmentics assigned by Jason Leigh, professor of computer science and EVL director.

The term, coined in a paper authored by Leigh and computer scientist Robert Kenyon, refers to technologies that expand the capabilities and characteristics of humans.

“What he [Leigh] wanted was something that’s considered invisible,” Mateevitsi said.

“How can you see the invisible? How can you sense the invisible? How can you communicate that back to the user?”

Mateevitsi and collaborators Brad Haggadone and Brian Kunzer began to brainstorm, and three months later a prototype was ready.

In one test, Haggadone wore the suit and walked through campus blindfolded, successfully avoiding people and obstacles.

In another test, Haggadone, again blindfolded, threw cardboard ninja-stars at oncoming “enemies,” hitting the attacker 95 percent of the time.

“They definitely exceeded our expectations,” Mateevitsi said of the experiments.

The next step, Mateevitsi said, could be working with users who are visually impaired to develop practical applications for the device.

“If the results are encouraging, then we will probably continue. If they’re not good, then we need to think how we can improve the system,” Mateevitsi said.

SpiderSense could also aid workers in environments where conditions make it difficult to see — fire fighting, for example.

“I believe we proved that this kind of technology works,” Mateevitsi said.

“It can definitely grow; we just need more user studies and more experiments.

“We did this in three months — you could imagine if we had two years.”

Mateevitsi presented a paper on SpiderSense March 7 at Augmented Human ’13, the 4th Augmented Human International Conference in Stuttgart, Germany. He also just landed a competitive 12-week internship with the Academy of Motion Picture Arts and Sciences’ Science and Technology Council. He’ll work at Pixar Animation Studios, developing studio tools.

He had a previous internship at DreamWorks, working to improve animation tools that let animators view their progress in real time.

“What I like about the movie industry, especially the animated movie industry, is that you are surrounded by artists and the environment is really fun and artistic,” Mateevitsi said.
