Wednesday, October 19, 2016

Exploring Assistant, A New Voice Interface with Familiar Problems and a Bright Future

A staple of sci-fi movies and games is the intelligent robot that receives and executes commands with ease. As we move deeper into the 21st century, the lovable AI personalities we see on TV and in games are slowly making their way into real life in the form of our mobile assistants.

Mobile assistant technology has grown considerably in recent years, but one major mobile platform has yet to see a true computer assistant: Android. Android has long been at the forefront of Google's search capabilities, yet it has been missing a genuine take on AI assistant technology.

Earlier this summer, Google introduced its next step in mobile assistant technology, Google Assistant. The assistant was announced at Google I/O and shown off as the intelligent search assistant Google has been pushing towards for years. After almost an entire summer of radio silence, Google Assistant appeared again, this time as a limited chatbot inside Google's messaging service, Allo. During the #madebygoogle event on October 4, it was announced that the full version of Google Assistant would only be coming to Google Home and the Google Pixel phones. This left a sour taste in many people's mouths, as they felt blindsided by the sudden restriction of support.

Thankfully, due to Android's tinker-friendly nature, Google Assistant found its way to the Nexus 6P (temporarily), and to many other devices afterwards. The assistant was acquired through a build.prop tweak that makes the device appear as a Pixel XL, and like many of these ports, it was made possible thanks to LlabTooFeR. Seeing the opportunity, plenty of us at the XDA office jumped at the chance to have the new assistant on our personal smartphones.
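For those curious about the tweak itself, it comes down to a couple of lines in /system/build.prop (root required). The exact values varied from guide to guide, so treat the following as a representative sketch rather than the canonical recipe; the model string makes the device pass as a Pixel XL, and ro.opa.eligible_device appears to be the flag the Google app checks:

    # Additions to /system/build.prop (root required)
    # Most guides recommend rebooting and clearing the Google app's data afterwards
    ro.product.model=Pixel XL
    ro.opa.eligible_device=true

After the reboot, long-pressing the home button brings up Assistant in place of Now on Tap.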

The following findings come from our testing of an unofficial implementation of Google Assistant, a product still in its early stages. As such, the experience might not be entirely as intended, and Google will likely iterate upon what we report here.

The port is just a preview, as the Pixel and Pixel XL are not out yet. I was able to get it working on my 2014 Moto X running CyanogenMod 14, Chief Editor Mario Serrafero got it on his OnePlus 3 running CyanogenMod 14, and Senior Editor Matthew Brack got it on his Huawei Mate 8 running EMUI on Marshmallow through Xposed and an updated Android N-ify module.

[Screenshots: Google Assistant running on each of our devices]

Ecstatic at our newfound access to the latest in Google's assistive technology, we put our respective assistants to the test to find out if they were as amazing as advertised. To start off, we asked what we could do, just to get a feel for everything the assistant was capable of. The assistant came back with three pages of suggestions, and all of the suggested searches worked out how you would expect them to, with a few kinks.

[Screenshot: Assistant's suggested commands]

One such kink was that you can only set a timer if the Google Clock app is installed. There was an easy workaround: simply ask it to remind you in X minutes to do a task. Another case of similar commands producing different results is making lists. When I asked it to "make a list," I received a notice that only shopping lists are supported. When I asked it to make a shopping list, it happily made a list in my preferred notes app. Similar problems happened with volume controls, as "mute my device" would not return the same result as the more specific "mute volume".
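As for why the timer command depends on Google Clock: my guess (and it is only a guess, as Google doesn't document Assistant's internals) is that the command simply fires Android's standard set-timer intent, which does nothing unless an installed app handles it. A minimal Java sketch of that intent:

    import android.app.Activity;
    import android.content.Intent;
    import android.provider.AlarmClock;

    public class TimerExample extends Activity {
        // Fires the platform's standard "set timer" intent; it only resolves
        // if a clock app (e.g. Google Clock) is installed to handle it.
        void setFiveMinuteTimer() {
            Intent timer = new Intent(AlarmClock.ACTION_SET_TIMER)
                    .putExtra(AlarmClock.EXTRA_LENGTH, 5 * 60)   // duration in seconds
                    .putExtra(AlarmClock.EXTRA_MESSAGE, "Tea")   // label for the timer
                    .putExtra(AlarmClock.EXTRA_SKIP_UI, true);   // start without opening the clock UI
            if (timer.resolveActivity(getPackageManager()) != null) {
                startActivity(timer);
            }
        }
    }

If nothing on the device resolves ACTION_SET_TIMER, there's simply no app for Assistant to hand the command to, which would explain the behavior we saw.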

Going through and trying to dig out what it can do outside the suggested tasks was quite fun. What the search assistant is capable of is interesting, if a bit underwhelming. When trying to recreate the demo from the October 4th event, I was able to redo the entire thing minus making a reservation; according to the assistant, that's not working just yet. Some other features were not available yet either, such as asking it to read text messages (keyword being yet).

We found that Assistant can modify system settings such as Wi-Fi, the flashlight, and Bluetooth. Asking it to change the temperature, however, resulted in a few odd Google search results instead of tapping into Google's smart home capabilities. I also had issues getting Assistant to cast different media to my Chromecast at home.

Communication was another mixed bag. Google Assistant had no trouble sending texts, calling folks, or talking through Hangouts. But when we attempted to get Assistant to send an Allo message or call someone on Duo, all that was returned was a Google search for the apps, or a link to their Play Store listings, despite the apps already being installed.

Some commands were hit or miss. Mario was able to get Assistant to pull up an orchestra performance on YouTube by only mentioning key terms rather than a title, whereas I was unable to. The contextual searching that should recognize I was in Maps would not work 100% of the time, and asking how far "it" was from a different location would provide directions from my location to that location, ignoring the dropped pin sitting in Maps. It felt inconsistent, and often left me wondering why it couldn't perform certain tasks as advertised. A gallery of screenshots from much of our testing of Google Assistant can be found at the end of the article.

Currently, it doesn’t feel like much more than Google voice search with a facelift

The way one reaches Assistant will be very familiar to anyone with a history in major mobile operating systems: just hold the home button and Google will start listening. This action replaces the gesture used to reach Screen Search (formerly Google Now on Tap), meaning that service has been partially removed. I say partially because the ability to read info on your screen for possible search queries is built into Google Assistant, via an upward swipe while on the Assistant screen (made intuitive by a small preview sitting at the bottom, which is a nice touch). However, not all of Screen Search's features made it over. The two notable features left behind are the ability to share one's screen and the ability to select any text on the screen. These removals put a damper on Google Assistant's functionality, as those were services I used daily. Personally, I'm sad to see these features go, but I'm hopeful they will be added back as Google Assistant develops and expands.

Something that caught my attention was the response time in Assistant. Google seemed to vary how long it listened depending on what words it had picked up. In general, the end of most complete sentences or phrases would grant about a second before Assistant stopped listening and began searching. If I cut myself off at what seemed like an awkward spot in a sentence, I would be given around three seconds before Assistant stopped listening and started searching. Another issue with voice is recognition, which was just as hit or miss as the commands. We've written about Google's voice recognition before, and our sentiments from that article are mostly the same here: when it works, it works; when it doesn't, I find myself having to repeat a command multiple times for Assistant to get it right.

At the end of the day, the Assistant doesn't feel like much more than regular Google Voice Search with a facelift. A lot of commands we could already use in Google Voice Search on any Android phone produce the same reactions. And in part, it's that facelift that makes Assistant feel all the more personal and useful. The interface is structured like a conversation, and it will suggest commands that are good follow-ups to results. However, this still doesn't hide the fact that Google Assistant is not as smart as it advertises itself to be, given how much of its advertised potential rests on AI. I'm still not able to ask it "what aisle is the bread on?" in a store, and it is not on par with what we expect out of such an AI search assistant.

That said, I have found myself using Google Assistant daily since enabling it on my phone. I honestly feel that its readiness and appeal have made Google search more useful than before. Not all of our writers who tested Google Assistant feel the same, and some haven't used it since they enabled it. Assistant still suffers from many of the inconsistencies and issues we find in Google Now, and even in the preview of Assistant in Allo. I also feel the situation is similar to Google Now on Tap at launch: underwhelming at first, but over time it gained many features and much polish that ultimately brought it up to early expectations. As with most services, your mileage will vary. This is ultimately a limited preview; the Pixel phones haven't reached us yet, and the functionality needs to be (and hopefully will be) expanded upon.

The facelift offers a huge improvement in presentation, but it has yet to improve function. Even though this still has hints of "beta product" all over it, it looks as if Google is heading in the right direction to bring us a full-fledged personal assistant, right on our phones. In summary, I feel this is more of a different interface for what we already have, but one that lays the foundation for much of what's to come with the Internet of Things and better service integration. It's bound to get better over time, and it might be a smart move to launch the service only on Pixel phones at first. It is certainly not there yet, but what we've seen makes it clear that this has the potential to become a more integral part of our Android experience than regular voice search, Google Now, or Now on Tap ever were.

You can check out some examples below:

[Gallery: screenshots from our Google Assistant testing]
