Word Lens - augmented reality translation app - jumps platforms, is now on Android

Jul 09, 2012 by Nancy Owano

(Phys.org) -- Tuna with hot sauce. Beach closed. Please use caution. Since 2010, users of Apple's iOS devices have enjoyed Word Lens, an application that instantly translates a foreign-language menu or road sign simply by the user pointing the device’s camera at the text in real time. Now Word Lens is offering its translation app for Android too. The Android app translates between English and Spanish, Italian, and French using just the video camera, and its notable feature is that no network connectivity is required.

“No network required,” reads the promotion. “Results appear immediately on your video screen when you need it, anywhere in the world.”

Word Lens uses text recognition to work out what the word or phrase is, and automatic-translation software translates it into the new language. The translation is then pasted over the original location. “An optical character recognition engine works with the in-built real time translator to translate foreign phrases for you. The converted text is then displayed on the screen of the smart phone.”


Otavio Good and John DeWeese from Quest Visual are the two developers behind Word Lens. According to Good, Word Lens tries to find out what the letters are and then looks in the dictionary. Then it draws the words back on the screen in translation.
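That detect-then-look-up-then-redraw pipeline can be illustrated with a minimal sketch. The code below is not Quest Visual's implementation; it assumes the pytesseract and Pillow libraries for OCR and drawing, and uses a tiny hypothetical Spanish-to-English word list in place of the app's full offline dictionaries.

```python
# Minimal sketch of the pipeline the article describes:
# recognize the letters, look each word up in a dictionary,
# and draw the translation back over the original location.
# Assumes pytesseract (with the Tesseract binary) and Pillow are installed.
from PIL import Image, ImageDraw
import pytesseract

# Hypothetical word-for-word dictionary, for illustration only.
DICTIONARY = {"playa": "beach", "cerrada": "closed", "precaucion": "caution"}

def translate_frame(frame: Image.Image) -> Image.Image:
    """Recognize words in one camera frame and overlay word-for-word translations."""
    data = pytesseract.image_to_data(frame, output_type=pytesseract.Output.DICT)
    out = frame.copy()
    draw = ImageDraw.Draw(out)
    for i, word in enumerate(data["text"]):
        translated = DICTIONARY.get(word.strip().lower())
        if not translated:
            continue  # unknown word: leave the original pixels untouched
        x, y = data["left"][i], data["top"][i]
        w, h = data["width"][i], data["height"][i]
        draw.rectangle([x, y, x + w, y + h], fill="white")  # cover the source word
        draw.text((x, y), translated, fill="black")         # draw the translation in place
    return out

if __name__ == "__main__":
    translate_frame(Image.open("sign.jpg")).save("sign_translated.jpg")
```

Because a sketch like this translates word by word rather than whole sentences, it also suggests why the output can read like "sauce spicy of anchovies" while still conveying the general meaning.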

The Android version offers translations between English and Spanish, French, and Italian for purchase, at an introductory price of $4.99 per language.

Those testing Word Lens on Android think it has a way to go. One site reported that its brief tests showed the app working well, apart from rapid flickering as translated words switched back and forth. Others described the version as having a quirky interface and being shaky and bumpy. One critique noted that the app does not work effortlessly: you need to keep your hand steady and scroll over sentences patiently. Translated phrases also show mangled English word order, such as “Tongue Bolivian” or “sauce spicy of anchovies,” but as a language support tool for visitors to foreign countries it is widely considered useful.

The developers themselves caution that the app works best with “clearly printed text,” and they note that it does not recognize handwriting or stylized fonts.

Good acknowledges the weaknesses and says he would be the first to admit the app is not perfect, “but perfect was not the goal,” he added. What is useful, he said, is that “you can get the general meaning.” (Whether it is a spicy sauce or a sauce spicy, you know what is coming.) The app gets good marks elsewhere for speed and useful accuracy.

Good says that future plans include introducing more languages. He is also considering a reader for the blind, which would read out loud the words that the app sees on signs.

More information: play.google.com/store/apps/det… c3VhbC53b3JkbGVucyJd
