Google's Pixel phones are the first official devices with Google Assistant, the company's machine learning-powered AI assistant, built directly in. The company wants Google Assistant to be "your own personal Google," performing basic tasks for a user, carrying on conversations, and handling search queries. YouTuber Marques Brownlee decided to test the new Google Assistant by putting it in a head-to-head competition with the latest version of Siri, running on an iPhone 7 Plus.
In the video, Brownlee puts both devices side-by-side and activates them at the same time. He gives them a series of queries and commands, testing how each one responds. Brownlee starts with simpler tasks, like checking the weather, solving math problems, opening apps, and setting timers. He then moves into slightly more advanced queries, like asking when the post office closes, what Tesla's stock price is, and who the President of the United States is.
After both assistants identify the president, Brownlee asks how tall he is. Google Assistant figures out that "he" refers to President Barack Obama and answers with his height, while Siri has to resort to a Bing search. Asked again how tall Obama is, Siri stumbles, searching for how tall the United States is. When Brownlee poses the question to both assistants more directly, both answer it with no problems.
Next, he once again tries to carry on a contextual conversation with the assistants. While Google Assistant can identify who won the Super Bowl and name the team's current quarterback, Siri struggles to connect the two queries. However, when asked whether the Los Angeles Clippers won, Siri lists the most recent Clippers game, a preseason matchup, while Google Assistant lists a game from last season. Google Assistant also stumbles while trying to identify a song that is playing.
Brownlee then asks a series of contextual questions about the city of Kathmandu and Facebook CEO Mark Zuckerberg. Siri keeps up with Google Assistant, understanding context and linking together multiple queries, despite initially hearing "Zuckerberg" as "Sucker berg."
The video concludes with Brownlee trying to talk to both Siri and Google Assistant, asking how they are and whether they can make him a sandwich, tell him a joke, and more. Surprisingly, Google Assistant is the more personable of the two, joking around with Brownlee.
Brownlee found that Google Assistant felt more personable than Siri, understanding context better and joking around a little more. However, he felt that Siri provided him with more information, using graphs and other visual elements. Siri, he said, presented the information he needed and got out of the way, while Google Assistant was more talkative, reading information aloud rather than relying on visual cues.
Apple has continually improved Siri over the years, adding new functions and opening it up to third-party applications. In August, Eddy Cue, Craig Federighi, and Phil Schiller explained that new machine learning techniques had cut Siri's error rate in half. Recent rumors have suggested Apple is aiming to improve Siri's functionality further for inclusion in an Amazon Echo-like smart home device. Apple has been hiring for its machine learning and AI divisions and in recent months acquired machine learning startups Turi and Tuplejump in an effort to improve Siri.
Top Rated Comments
He also persisted in asking a contextual follow-up question when Siri clearly missed the original question. Of course she's not going to know the follow-up. For the record, she understood the exact same question when I asked "How tall is Barack Obama" and unlimited follow-ups like "where was he born", "how old is he", etc... Siri also understood "Mark Zuckerberg" the first time I asked.
I was impressed that Siri was consistently faster at answering questions than Google Assistant, and I appreciated the presentation of the answers in a well-designed card instead of the small, busy copy in Google's responses.
Finally, another one of his criticisms was that Siri presents visual information rather than reading it out. Again, his ignorance of iOS is showing. Siri will read information to you when she thinks you're doing an activity where you're not looking at the screen. For example, when you summon Siri with "Hey Siri", she will read you the answer. If you call her up by pressing the headphone mic button, she'll also present information audibly. But when you press the home button to bring up Siri, she correctly assumes that you have the iPhone in front of you and gives you visual information.
Either Brownlee is showing favouritism in his comparisons or he needs to learn more about iOS if he wants to remain credible.
When you hold the home button and ask something, Siri will often only display results, as it assumes that you are holding your device in your hand and looking at it. If you say "Hey Siri" with the display turned off, Siri will read out more of what it has found, as it assumes you are not looking at the screen.
I wish we could fire Siri and use Google Assistant or Google Now! When I use Google Now on my friend's phone, it just seems to work so much better and understand me a lot better.
I guess I'll go back to dreaming that maybe one day Siri and Tim will go!