Intelligent personal assistants are conquering the world’s living rooms. Thanks to smart home integration, you can now turn lights on or off, unlock your door, or send the robot vacuum on a cleaning mission using just voice commands. With Dragon Drive, this now extends to your car.
Voice assistants have existed in cars for a few years now. Early on, some words were not fully understood by the assistant, which added stress and frustration while driving. But in the past few years the technology has evolved rapidly, and even inaccurately spoken words and complex commands are no longer a problem.
Nuance is showcasing the future of automotive assistants at IAA 2017, the International Motor Show, held in Frankfurt, Germany from September 14th to September 24th, 2017, and one of the most important automotive fairs worldwide. At New Mobility World in Hall 3.1, Booth C45, Nuance demonstrates the latest features of its connected car platform Dragon Drive. For that purpose, numerous microphones have been installed in a car at the booth, allowing Nuance to showcase an impressive multi-seat speech recognition demo. The system not only recognizes and executes voice commands, it responds differently depending on who gives the command. The command “Call my mother” will ring a different person depending on whether my wife or I give the command from the passenger seat.
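The idea behind per-seat resolution can be sketched in a few lines. This is purely illustrative, not Nuance's actual API: the seat names, occupant profiles, and phone numbers are invented, and the assumption is simply that the microphone array tells the system which seat an utterance came from.

```python
# Illustrative sketch (not Nuance's real implementation): the same relation
# word ("mother") resolves to a different contact depending on which seat
# the command was spoken from. All profiles and numbers are made up.
PROFILES = {
    "driver":    {"name": "Markus", "contacts": {"mother": "+49 151 1111111"}},
    "passenger": {"name": "Anna",   "contacts": {"mother": "+49 160 2222222"}},
}

def resolve_contact(seat: str, relation: str) -> str:
    """Look up a relation like "mother" in the speaking occupant's own contacts."""
    return PROFILES[seat]["contacts"][relation]
```

So “Call my mother” from the driver's seat and from the passenger seat would dial two different numbers, even though the spoken command is identical.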
This technology enables a variety of use cases. The game Dragon Tunes, for example, plays a song for all passengers in the car, and you have to guess the song and artist. The system can detect who said “got it” first, and if your answer is correct, you earn points. This is an entertaining way to show the capabilities of multi-seat voice recognition.
The search for a suitable parking space is another great way of demonstrating Dragon Drive’s ability to handle complex commands with various parameters. A command such as “Find parking near Grand Circus from 3pm until 4pm that is covered parking and takes cash” is not a problem. This shows the advantage of a system that can recognize context-based connections, and does not require you to enter each parameter step-by-step, like most current systems still do.
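A command like that is essentially broken down into slots: a location, a time window, and a couple of constraints, all filled from one utterance. Here is a toy single-pass extractor for exactly the example above; it is a keyword-and-regex sketch of the idea, nothing like the full natural language understanding a production system would use.

```python
import re

def parse_parking_command(utterance: str) -> dict:
    """Toy slot extractor for parking queries (illustrative only).

    Pulls location, time window, and constraints out of a single
    utterance instead of asking for each parameter step by step.
    """
    slots = {}
    # Location: whatever sits between "near" and "from".
    m = re.search(r"near ([\w\s]+?) from", utterance)
    if m:
        slots["location"] = m.group(1).strip()
    # Time window: "from <time> until <time>".
    m = re.search(r"from (\d{1,2}(?::\d{2})?\s*[ap]m) until (\d{1,2}(?::\d{2})?\s*[ap]m)",
                  utterance)
    if m:
        slots["start"], slots["end"] = m.group(1), m.group(2)
    # Simple keyword constraints.
    if "covered" in utterance:
        slots["covered"] = True
    if "cash" in utterance:
        slots["payment"] = "cash"
    return slots
```

Feeding it the parking command from the article yields the location “Grand Circus”, the 3pm–4pm window, and the covered/cash constraints in one pass.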
Another new feature is Just Talk. There is no need for an activation word such as “Hello Dragon” anymore. Just issue a command and the system will automatically execute it. Say “Send a message to John. Hi John, I’m running late by 25 minutes.” and your message will be sent.
You can also control your smart home components while driving, for example turning off a light in the house or closing the front door in case you forgot. Interoperability of voice assistants is a very important point here: Apple Siri, Amazon Alexa, Google Home, and Microsoft Cortana need to communicate with each other in order to support the user. At the end of the day, the user just wants to say a command like “Set up a meeting at 3pm today with John and Markus” or “How are the Broncos doing?” Thanks to existing programming interfaces, Nuance can distinguish between these different requests and route each one to the relevant assistant.
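Conceptually this is a routing problem: classify the request, then hand it to whichever back-end assistant covers that domain. The keyword lists and the assistant-to-domain mapping below are invented for illustration; a real system would use a trained intent classifier rather than keyword matching.

```python
def route_request(utterance: str) -> str:
    """Naive keyword router (illustrative only): decide which back-end
    assistant should handle a request. The domain assignments here are
    assumptions, not how Dragon Drive actually dispatches requests."""
    routes = [
        (("meeting", "calendar", "appointment"), "Cortana"),  # productivity
        (("light", "door", "thermostat"),        "Alexa"),    # smart home
        (("score", "doing", "weather"),          "Google"),   # knowledge
    ]
    text = utterance.lower()
    for keywords, assistant in routes:
        if any(keyword in text for keyword in keywords):
            return assistant
    return "built-in"  # fall back to the in-car assistant
```

With this sketch, the meeting request would go to a productivity assistant while the sports question goes to a knowledge assistant, even though the driver spoke both to the same in-car system.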
Nuance technology is currently integrated in over 180 million vehicles worldwide, and the company is targeting 200 million by the end of this year. The automotive sector is mainly the responsibility of the German team at the research labs in Aachen and Ulm, where around 500 employees are constantly working on new speech recognition functions for the car. At the Nuance booth at IAA 2017, I was able to experience the latest version of Dragon Drive, and I cannot wait for features like Just Talk to be available in my car.
In the following video you can see the latest features of Dragon Drive. I’m sure you’ll be just as excited about it as I am.