Excerpt:
Based on $150 million of research by the Stanford Research Institute and the Defense Advanced Research Projects Agency (DARPA), Siri melds speech input, a soothing-if-robotic synthesized female voice, natural-language processing technology, location awareness, and integration with Yelp and the Wolfram Alpha knowledge engine into something new and amazing.
You begin by holding down the Home button or simply lifting the phone to your ear, then telling Siri how it can help you. All of the following spoken requests, and dozens more, worked perfectly for me:
"Remind me to call my wife when I get home."
"Schedule a call with Tom at 2pm"
"Wake me up at midnight tomorrow."
"Find me a Portuguese restaurant in San Francisco."
"What is 14,000 Japanese Yen in U.S. dollars?"
"When did William Howard Taft die?
Apple says Siri is in beta, but it's already remarkably clever and conversational. It understands family relationships, and if you haven't told it who your spouse or mom is, it'll ask, then remember. If you have seven Toms in your contacts, as I do, it'll list them all and ask which one you meant. It notices when you've arrived at your home or office. You can tell it "I'm hungry" or ask "Is there a God?" or "Where is Apple?" and it'll understand and say something relevant in response.
True, Siri isn't going to beat IBM's Watson supercomputer at Jeopardy anytime soon: there are lots of things you might want it to do, such as provide spoken driving directions, that are beyond its current skills. In fact, it's missing some cool features from the original Siri app, such as movie information and flight statuses. And in my testing, it has sometimes failed to understand me or simply stalled.
Still, Siri is breathtaking for a beta. If voice-activated assistants are all around us in five or ten years, we'll look back and say it all started here.