The iRobot Q&A: Colin Angle on the Next Robotics Inflection Point

(Page 2 of 5)

mapping. “Aware” software is our robot intelligence system that drives our robots. We integrated that operating system with Visual SLAM [Simultaneous Localization and Mapping], which uses cameras as the primary sensor. Our advances in that technology started to create the opportunity to do more than tele-operation. Soldiers didn’t want to be staring at a screen getting their robot into position when somebody might be targeting them. Finally, we have practical robots that understand enough about their environment to know where they are.

That led to a very exciting technology demonstration called Ava. The architectural concept was that, from the neck down, we were going to have a robot that could do navigation, mapping, and obstacle avoidance, and from the neck up we were going to put a tablet [computer] to tap into the mobile computing industry. We’d get voice and video over IP, sexy touch-screen interfaces, powerful mass-market development tools, speaker-independent voice recognition, facial recognition, all the stuff the mobile industry is doing, “for free.” So we launched that a couple of years ago.

Now we’ve just launched this [RP-VITA robot], which is the first practical application of the Ava technology, though far from the last. It’s done in partnership with InTouch Health. The mission of this robot is, we want to start at the high end. We have great confidence in our system, and we wanted to demonstrate that not only could we navigate and provide a great remote-presence experience, we could do it in a hospital. The good thing about a hospital is you’re leveraging a doctor’s time, which is a very valuable commodity. But the technical challenge is that you’re operating in medical environments, where the consequences are high if things go wrong. We’ve been able to create a system that navigates in hospitals, acts as an avatar for a doctor, and actually allows medical diagnosis to be done remotely.

X: So what’s the deeper idea here?

CA: One of our goals is to create a system—this might sound radical—that’s better than being there. What would it take for a remote doctor using this system to compensate for not physically being there, with information and technology? If I’m looking at you on this system, I can pull up your medical records, get real-time displays of all the systems you’re hooked up to, and pull up data associated with you. I can also know who your treatment team is, and just by pushing some buttons, pull them into this robot so we can all look at you, have a conference call, and discuss what should be done.

This integration of information with a high-quality system that can get itself around the hospital on a schedule (visit this doctor, this patient, that patient, monitor the patients) is really a holistic patient-care system: it brings more expertise, with the right information, to bear at the right time on the patients who need that care. It’s not whoever’s in the building, it’s whoever in the world is the most appropriate person or team to treat you when you need it. That’s the big idea as far as how this particular system will create value for the care team. We’re really excited about this. The next concept is a more generic remote-presence system that will allow video conferencing to be used more aggressively in more settings as well. So we’re just getting started, obviously, on what we think is going to be an important new venture.

X: Can you talk more about the broader impact? How will this specifically affect the quality and cost of patient care?

CA: First, what will happen in hospitals—it’s already happening—is that, procedure by procedure, application by application, robots will be used more and more. Right now robots have made real inroads in stroke care, where there are two main types of stroke. One type is treatable with t-PA [tissue plasminogen activator], but giving it to the wrong kind of stroke is detrimental, so you need a specialist to make the diagnosis. Those specialists are present at some number of city hospitals, but not at your local hospital. That diagnosis needs to be made within three hours of the onset of the stroke to have the maximum positive effect. A robot would allow a specialist to see a patient in a rural hospital and make the diagnosis, which is a win-win for everyone: the local hospital can treat the patient and gets the admission, and since 30 percent of those admissions would ultimately need to be transferred to the big hospital, the big hospital gets the referral business. As you play this forward, you can improve the quality of care by bringing the right expertise with the right tools to the patient at the right time, through ubiquitous access to telemedicine tools. So that’s a huge win in patient care.

And then as you go forward, there’s an even bigger idea. How do we treat people to an increasingly significant degree in their own homes? Because one of the big problems around the cost of care is the …


Gregory T. Huang is Xconomy’s Deputy Editor, National IT Editor, and Editor of Xconomy Boston.



  • hateroomba

    iRobot’s equipment is awful. The batteries run out in no time. A not-very-old Roomba can only run for 10 minutes or so.

  • A Boston Woman in Robotics

    “In the end it’s refreshing to see that Angle, a tech celebrity and an iconic figure in Boston’s innovation scene, hasn’t lost his boyish love of robots. (None of these guys ever do.)” … or girls, and their girlish love of robots. Let’s not forget that iRobot’s cofounder was a woman who is now off building more robots.