“Crowdsourcing Nutrition”

November 3rd, 2011 / in Research News / by Erwin Gianchandani

Crowdsourcing Nutrition in a Snap [image courtesy Harvard].

Most of us have sat down to dinner and wondered just how many calories we are about to consume. Now, thanks to undergraduate researchers at Harvard University, there's a way to find out quickly, easily, and quite reliably, all with the simple snap of a photo and the help of the crowd.

According to Harvard’s press release:

Americans spend upwards of $40 billion a year on dieting advice and self-help books, but the first step in any healthy eating strategy is basic awareness—what’s on the plate.


If keeping a food diary seems like too much effort, despair not: computer scientists at Harvard have devised a tool that lets you snap a photo of your meal and let the crowd do the rest.


PlateMate’s calorie estimates have proved, in tests, to be just as accurate as those of trained nutritionists, and more accurate than the user’s own logs. The research was presented at the 24th ACM Symposium on User Interface Software and Technology, a leading conference on human-computer interaction…


“We can take things that used to require experts and do them with crowds,” says Jon Noronha, who co-developed PlateMate as an undergraduate at Harvard and now works at Microsoft. “Estimating the nutritional value of a meal is a fairly complex task, from a computational standpoint, but with a structured workflow and some cultural awareness, we’ve expanded what crowdsourcing can achieve…”


Often, individuals who claim they are trying to lose weight will underestimate their caloric intake, so PlateMate’s advantage is that it allows the user to quickly consult impartial observers, without having to pay for the advice and supervision of an expert nutritionist.


Reproducing the accuracy of an expert in a crowd of untrained strangers, however, was not straightforward.

So how did they do it? Read more about it after the jump…

“Computer scientists normally focus on the computational aspects of a problem, but the HR issues of working with crowds can be just as challenging,” says Krzysztof Gajos, Assistant Professor of Computer Science at [Harvard] and the students’ adviser.


PlateMate works in coordination with Amazon Mechanical Turk, a system originally intended to help improve product listings on Amazon.com. Turkers, as the crowd workers call themselves, receive a few cents for each puzzle-like task they complete.


PlateMate divides nutrition analysis into several iterative tasks, asking groups of Turkers to distinguish between foods in the photo, identify what they are, and estimate quantities. The nutrition totals for the meal are then automatically calculated.
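
To make that staged workflow concrete, here is a minimal Python sketch of the decomposition: one crowd task per stage, followed by an automatic calorie total. The names, the stubbed crowd_* functions, and the tiny calorie table are all hypothetical stand-ins for illustration, not the actual PlateMate code or its nutrition database; the stubs simply return canned answers so the example runs end to end.

from dataclasses import dataclass

@dataclass
class FoodItem:
    name: str      # e.g. "grilled chicken breast"
    grams: float   # estimated portion size

def crowd_tag(photo):
    """Stage 1: workers mark the distinct foods visible in the photo."""
    return ["box_1", "box_2"]              # placeholder regions

def crowd_identify(box):
    """Stage 2: workers name the food inside one marked region."""
    return {"box_1": "grilled chicken breast", "box_2": "white rice"}[box]

def crowd_measure(box, name):
    """Stage 3: workers estimate the portion of that food in grams."""
    return {"box_1": 150.0, "box_2": 200.0}[box]

# Toy calorie table; a real system would consult a nutrition database.
CALORIES_PER_GRAM = {"grilled chicken breast": 1.65, "white rice": 1.30}

def estimate_calories(photo):
    items = []
    for box in crowd_tag(photo):           # stage 1: find the foods
        name = crowd_identify(box)         # stage 2: name each one
        grams = crowd_measure(box, name)   # stage 3: size each portion
        items.append(FoodItem(name, grams))
    # The final totals are computed automatically, not by the crowd.
    return sum(CALORIES_PER_GRAM.get(item.name, 0.0) * item.grams for item in items)

print(round(estimate_calories("dinner.jpg")))   # roughly 508 calories in this toy example

The point of the decomposition is that each stage is simple enough for a non-expert to answer in seconds, while the arithmetic is left to the machine.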


The researchers did encounter some common-sense problems with sending photographs to strangers without any context. A latte made with whole milk looks no different than one made with skim milk, a fast-food burger might pack in more calories than one cooked at home, and a close-up photo of a bag of chips could indicate either a sample-sized snack or a late-night binge on a bag designed for 12.


Early tests also identified some cultural limitations. Overseas Turkers routinely identified a burger bun with ketchup as a muffin with jam.


Even after restricting the tests to American workers, Noronha and [classmate Eric] Hysen discovered that portions of chicken were being characterized as “chicken feet,” again and again. The puzzling result drew their attention to another significant and common problem in crowdsourcing: worker laziness. “Chicken feet” was simply the first option in a list of chicken-related foods, so lazy Turkers were just clicking it and moving on to a new task.


Noronha and Hysen solved these problems by designing simple, clearly defined tasks, and algorithms that compare several answers, selecting the best one. They provided warnings about common errors, and vetted their Turkers to weed out those with a history of poor work.
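
As a rough illustration of that answer-comparison step, the sketch below (in the same hypothetical Python style) takes redundant answers to a single identification task, drops workers with a poor track record, and keeps the most common remaining answer. The approval-rate threshold and the response format are assumptions made for the example, not PlateMate's actual quality controls.

from collections import Counter

MIN_APPROVAL_RATE = 0.95   # assumed cutoff for a "history of poor work"

def best_answer(responses):
    """Majority vote over answers from workers who pass the vetting check."""
    votes = [r["answer"] for r in responses
             if r["approval_rate"] >= MIN_APPROVAL_RATE]
    if not votes:
        return None            # no trustworthy answers yet; ask more workers
    return Counter(votes).most_common(1)[0][0]

responses = [
    {"approval_rate": 0.99, "answer": "chicken breast"},
    {"approval_rate": 0.97, "answer": "chicken breast"},
    {"approval_rate": 0.60, "answer": "chicken feet"},   # the lazy first-option click
]
print(best_answer(responses))   # -> chicken breast

Asking several workers the same question and vetting their histories is what lets a crowd of strangers approach the reliability the release attributes to a trained nutritionist.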


The resulting tool is easier and more accurate than keeping a food diary and cheaper than consulting a nutritionist.

The full press release is available here. And you can learn more in a Boston Globe article.
