Loup Ventures has posted its second annual Digital Assistant report, and it shows just how accurate and reliable Google Assistant is compared to its competitors. In the report, Google Assistant answered the 800 questions asked of it with 86% accuracy and understood 100% of the questions posed during the test. Assistant’s nearest competitor was Apple Siri, which had a 78.5% accuracy rate and a 99% understanding rate.
The results reflect what a lot of people, particularly the Android faithful, have felt about Google’s AI-driven assistant. Now those subjective impressions have more concrete support in this report from Loup Ventures.
The test done by the research firm was thorough. It consisted of 800 questions across five major categories: Local, Commerce, Navigation, Information, and Command. The test was conducted on Amazon Alexa, Apple Siri, Google Assistant, and Microsoft Cortana, with each digital assistant asked the same set of questions. The results showed that all of the assistants improved over the same test from April 2017, with Apple Siri and Google Assistant making the biggest improvements.
The firm points out that this is the first year Amazon Alexa was added to the test, so it only has results for this year in the chart above. It also notes that, since Cortana and Alexa are not built into a phone platform the way Assistant and Siri are, the iOS versions of those apps were used on an iPhone for the testing.
As you read through the report, which I strongly suggest you do, you’ll see that Assistant led the way in every category of question with the exception of Command, where Siri edged out Assistant.