Until recently, the idea that we could speak and write in natural language to communicate with our computers was little more than a futuristic dream, and the thought of a personalized robot assistant who could turn our every whim into reality nothing but the stuff of wild imagination. But now, with the likes of Siri and Alexa on all our devices and in our homes, that dream is becoming part of our everyday lives. The ability of our machines to detect the languages we speak and the words we’re saying is becoming more sophisticated by the day. Here are some advances that are worth talking about.
Twitter has partnered with Bing to translate Tweets in more than 40 languages, meaning you don’t have to go to the effort of translating them yourself. Twitter has received criticism, though, for excluding a number of languages from its automatic language detection, which led to strange glitches: tweets in most African languages, which predominantly use the Latin alphabet, were recognized as Indonesian. Swahili was added to the detection service in May 2018, and it is hoped it will be included in the list of languages you can use Twitter in as well. You can currently use Twitter in almost 50 languages, including a beta version of LOLCATZ if you’re so inclined.
Facebook also automatically translates a number of languages, and offers users the chance to use the platform in a multitude of languages too. What is more exciting, however, is the automatic translation of messages in Facebook Messenger. At the moment this feature is only available for conversations between English and Spanish speakers in the United States and Mexico, but the hope is to roll it out to many other languages and countries as well.
Google Translate has had an automatic language detector in place for years, far outperforming even its nearest competitors, though that isn’t to say Google Translate doesn’t mess up. It is constantly improving, through a mixture of machine learning and human suggestions for better translations, and it has recently rolled out an offline version that offers the same quality of translation you are used to online for 59 of its languages. Microsoft did something similar earlier in the year, though only for a dozen languages. Time to catch up, Microsoft!
Google Home has started to understand and respond to commands in Spanish, adding to the English, French, German, Italian, and Japanese it already knows. This outstrips Amazon, which currently supports English, German, and Japanese. Google plans to have its smart assistant understand 30 languages in total by the end of the year.
These advances in our technology all sound pretty good so far, and, more importantly, helpful. Yet having all these machines understand us is creeping some people out. Google launched Duplex, its phone-calling digital concierge service, in May, and many users immediately complained that the bot sounded too human. So human, in fact, that many were unable to tell that the person calling to make a reservation or an appointment was not actually a live person at all!
Duplex even uses fillers like “um” and “ah,” which added to the confusion by making the conversation sound even more natural. Google came under fire for this and was accused of being deliberately misleading, since the initial trials didn’t include an announcement that it wasn’t an actual person calling.
Another recent Google update removed the need to say “OK, Google” before every separate command to Google Assistant. Once a user has made an initial request to activate Continued Conversation, Google Assistant will continue to, well, converse, so long as they speak within eight seconds of their last instruction. So we really can keep talking to our machines! You can end this eight-second window early by saying “stop” or “thank you”; we’re even being encouraged to be polite to our devices! How very human of us! And how lazy have we become as a society, that feedback complaining about having to say “OK, Google” repeatedly led to this feature being created in the first place?
We can now even talk to Alexa through our iPods and iPhones if we worship at the temple of Apple. Alexa fans are already saying that she is vastly superior to Siri, though we all do have our favorites. But the question is, just how many of us lead lives so frantically busy that we need these voice assistants available 24/7?
Change is amazing, technological advances are exciting, and we can’t wait to see what comes next. Though we hope this doesn’t mean we stop talking to each other in favor of our devices, in all the languages this world has to offer!