If software is needlessly complex and tedious, almost no one is going to use it. This fundamental tenet of technology has been one of the biggest barriers to any kind of widespread embrace of calorie tracking. Researchers at MIT are looking to change that with a new voice-activated prototype for logging nutrition.
Counting calories requires either a lot of dedication or an intentionally simple method. Either you inconvenience yourself by logging what you consume in painstaking detail, or you opt to track simplified data, like noting whether you had a light, medium, or heavy meal. But Boston-based Tufts University and MIT are hoping to achieve the best of both worlds by developing language-recognition software tied into the USDA's nutritional database. Simply tell the application what you ate and how much, and it automatically calculates your calorie intake. You can specify quantities verbally, or thumb through drop-down menus and type in entries manually.
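To give a rough sense of how that kind of lookup works, here is a minimal sketch of tallying calories from spoken food-and-quantity pairs. The food table and parse format are hypothetical stand-ins, not the team's actual system or the USDA database itself.

```python
# Illustrative sketch only: a toy calorie tally that maps parsed food phrases to
# calorie values per serving. The real MIT/Tufts prototype ties into the USDA
# nutritional database; the FOODS table here is a hypothetical stand-in.
FOODS = {
    "oatmeal": {"unit": "cup", "kcal_per_unit": 150},
    "oatmeal cookie": {"unit": "cookie", "kcal_per_unit": 65},
    "banana": {"unit": "medium", "kcal_per_unit": 105},
}

def tally_calories(logged_items):
    """Sum calories for (food, quantity) pairs produced by the language front end."""
    total = 0.0
    for food, quantity in logged_items:
        entry = FOODS.get(food)
        if entry is None:
            raise KeyError(f"unknown food: {food!r}")
        total += quantity * entry["kcal_per_unit"]
    return total

# "I had a bowl of oatmeal and two oatmeal cookies" -> parsed items below
print(tally_calories([("oatmeal", 1), ("oatmeal cookie", 2)]))  # 280.0
```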
Translating language into the right food categories can be tricky, since the same word can mean different things in different contexts. In "oatmeal" and "oatmeal cookie," for example, "oatmeal" is a food in the first case and a modifier of another food in the second. After working through those ambiguities, the team's calorie counter can currently recognize more than 10,000 different foods.
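One common way to handle that kind of ambiguity is to prefer the longest matching food phrase, so "oatmeal cookie" wins over the shorter match "oatmeal." The sketch below illustrates the idea with a tiny, hypothetical lexicon; it is not a description of the researchers' actual model.

```python
# Illustrative sketch only: greedy longest-match tagging over a small food
# lexicon, showing why "oatmeal cookie" should win over the shorter "oatmeal".
# The lexicon and whitespace tokenization are hypothetical simplifications.
FOOD_LEXICON = {"oatmeal", "oatmeal cookie", "cookie", "banana"}
MAX_PHRASE_LEN = 2  # longest food name in the lexicon, in tokens

def tag_foods(utterance):
    tokens = utterance.lower().split()
    foods, i = [], 0
    while i < len(tokens):
        # Try the longest candidate phrase starting at token i first.
        for span in range(min(MAX_PHRASE_LEN, len(tokens) - i), 0, -1):
            phrase = " ".join(tokens[i:i + span])
            if phrase in FOOD_LEXICON:
                foods.append(phrase)
                i += span
                break
        else:
            i += 1  # no food starts here; move on to the next token
    return foods

print(tag_foods("I had oatmeal and an oatmeal cookie"))
# ['oatmeal', 'oatmeal cookie']
```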
The web-based program is currently being showcased at a speech recognition conference in Shanghai, but the team says it will keep working on the system and begin testing it with people shortly. Hopefully, it will help break through the tedium barrier that keeps accurate calorie counting from helping millions of people stay healthy.
[Source: Gizmodo]