Dear Apple: Go Big with Siri and Nuance in iOS 5
Nuance does it extremely well (so well that the company has a market cap of more than $6 billion today). It then presents those sounds as transcriptions. But it ends there: it does not understand those transcriptions.
Siri is far better at understanding. It’s a gifted natural-language AI technology that knows the myriad ways people express intent. Siri knows that by “book” you don’t mean a paperback novel but the action to “reserve” a table. It adds a deep layer of intelligence on top of Nuance and helps turn your spoken intent into action. It can go right to OpenTable and reserve a table for you. With one verbal command, you can skip the thumb typing and avoid the three Web screens it would otherwise take to complete your task.
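To make the two layers concrete, here is a toy sketch of the pipeline described above. It is not Apple’s, Siri’s, or Nuance’s actual API; the function names, the keyword matching, and the returned intent structure are all invented for illustration. Layer one (speech recognition) yields a transcription; layer two (natural-language understanding) maps words like “book” to an action.

```python
# Hypothetical two-layer pipeline: transcription, then intent understanding.
# Nothing here reflects a real Siri or Nuance interface.

def transcribe(audio: str) -> str:
    """Stand-in for the speech-recognition layer (Nuance's role).

    A real recognizer would take audio; here the input is already text,
    so we just normalize it, mimicking a transcription result.
    """
    return audio.lower().strip()

def parse_intent(transcription: str) -> dict:
    """Stand-in for the natural-language layer (Siri's role).

    Maps the transcription to an actionable intent: "book" plus "table"
    means "reserve a table," not "buy a paperback."
    """
    if "book" in transcription and "table" in transcription:
        return {"action": "reserve_table", "service": "OpenTable"}
    return {"action": "unknown"}

command = transcribe("Book a table for two at 7pm")
intent = parse_intent(command)
print(intent["action"])  # -> reserve_table
```

The point of the separation is the article’s: transcription alone stops at words, while the intent layer is what turns those words into a completed task.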
It would be quintessential Apple to rewrite the rules and surprise the world, as it did with the iPad. Apple owns nearly 90 percent of the tablet market today because it dictated what a tablet should be—with its industrial design, elegant UI, treasure chest of applications, and an entire supporting ecosystem.
Apple can also dictate what a smartphone should be. It must have deeply embedded voice-recognition and artificial-intelligence capabilities, a bevy of related applications, and the ability to run across the world’s telecommunications networks. (Nuance, by the way, speaks 56 languages and dialects to date.) Apple should offer APIs that let developers tap directly into Nuance’s technology to speech-enable more types of interactions, and into Siri’s technology to translate users’ voice commands into practical actions.
That’s how Apple can deliver on the promise of Siri—by leveraging speech recognition and natural-language understanding to make the mobile platform more productive and capable than a keyboard-driven, big-screen system allows. In this way, Apple can give its own developer community powerful new tools, reclaim the high ground, and make it exceedingly difficult for others to follow.