I’ve never had much luck as a matchmaker. In fact, I am perfectly horrible at it. Still, this is one time when I am pretty sure that a marriage ought to take place. That’s right, I think it is time for Apple and Nuance to start getting serious about one another. Sure, Nuance now owns MacSpeech Dictate and Scribe, which run on OS X. And yes, Nuance’s Dragon Dictation and Dragon Search work beautifully on the iPhone and iPad. But that’s just dating. I want to see them take the next step.
And what is that next step? I want to see Nuance’s voice recognition engine built into iPhone OS. And here’s why…
Google moved mobile computing forward in a big way when it integrated a voice recognition system into the operating system of the Nexus One. The idea was that by using the same technology employed to transcribe voice messages on Google Voice, Google could bring that convenience to its Android platform. With it you could use your voice to add text to any application, regardless of where you were on the phone.
If you were using the Nexus One’s email application you had a choice: you could either start typing to add text or tap the microphone icon and start speaking. The same went for writing a document or sharing a tweet. Because the voice-to-text technology was part of the OS, it was available as an input option anywhere the keyboard was available. It is an AWESOME idea.
The only problem with Google’s approach is that Google’s voice transcription software isn’t very good when it comes to accuracy. That made it all but useless to me during the short time that I was a Nexus One owner. I use the transcription service provided by Google Voice despite the fact that the transcriptions are often poor. I mean, sometimes the words the person leaving the message actually said and the transcription Google Voice delivers aren’t even in the same ballpark.
And even when the message is left by someone speaking slowly and clearly, the results are mixed.
For example, here is what Google Voice delivered when I called my own number and read the first two paragraphs of this post as my “voicemail message”:
I have never had much luck as a match maker period. In fact, I am perfectly horrible at it. 38 still come. This is one time when I’m pretty sure that a marriage ought to take place. Period. That’s right, comma, I think it is time for Apple and he wants to start getting serious about one another period sure comma new wants now owns Mac speech dictate and scribes which run on O S X period. And yes, comma, Nuances track indication and rack and search working beautifully on the iPhone and I’ve at period, but that’s just dating period. I want to see them. Take the next step period and what is that next step question mark. I want to see nuances voice recognition Engineer built into O S iPhone period and here’s why. Dash.
It is pretty bad. If this were a voicemail message I would have to go listen to the actual message to get a full understanding of it. And if this were an email, it would take longer to edit it into something intelligible than it would have taken to simply type it out in the first place. In other words, with results like these it is all but unusable.
Compare that to Dragon Dictation and the difference is clear. Here are the same two paragraphs read and transcribed by Dragon Dictation on my iPhone without a headset:
I have never had much luck as a matchmaker. In fact I am perfectly horrible at it. Still, this is one time when I am pretty sure that he marriage ought to take place. That’s right, I think it is time for Apple and Nuance to start getting serious about one another. Sure, Nuance now owns Mac speech dictate and scribe which run on OSx. And yes, nuances dragon dictation and Dragon search work beautifully on the iPhone and iPod. But that’s just dating. I want to see them take the next step. And what is the next step? I want to see if Nuance’s voice recognition engine built into OS iPhone. And here’s why —
Pretty big difference, huh? Now THIS is something I would use all the time.
And I do use Dragon Dictation a great deal. The problem is that, thanks to Apple, it is fenced off in its own little “app world”. You can use the app to transcribe your message, but then you need to either send the text to the Mail or SMS app or copy it to the clipboard and paste it into the app where you want it. It is a multi-step process that is more than a bit annoying.
Now imagine if Dragon Dictation were incorporated into the entire OS of the iPhone. You start up your email application to write a new message or respond to an old one, and when the keyboard comes up you can either start typing or tap the microphone button and start speaking. The same would happen if you were, for example, using Pages on the iPad: you would have the choice of typing on the keyboard or tapping the microphone button, activating the Dragon Dictation engine, and speaking.
Such an integration would make creating text on the iPhone and iPad easier than ever and would help ensure that they are seen as “productivity devices” rather than “consumption devices”. T-Mobile just did exactly this with the myTouch 3G Slide, and while I have not had a chance to try it, something tells me it works a whole lot better than the voice recognition on the Nexus One. Apple should do the same thing with the next generation of iPhone OS. It would make a world of difference.
No, I don’t expect we’ll see such a thing any time soon, but if we did… I truly believe this would be one amazing way for Apple to move the iPhone OS platform forward. Yes, a marriage between the best voice recognition technology on the market and the best mobile operating system available would be amazing. Sadly, something tells me they are simply going to continue dating… at least for now.