Codebreaker

Computer teaches itself

John Moe Oct 5, 2010

Great. Juuuuust great. Have we learned nothing from Blade Runner? Battlestar Galactica? Terminator? Apparently not. A computer in a basement at Carnegie Mellon University is teaching itself to interpret semantics, the meaning of language. It scrapes data from the web and learns to associate it with other data.

From the Times:

The Never-Ending Language Learning system, or NELL, has made an impressive showing so far. NELL scans hundreds of millions of Web pages for text patterns that it uses to learn facts, 390,000 to date, with an estimated accuracy of 87 percent. These facts are grouped into semantic categories — cities, companies, sports teams, actors, universities, plants and 274 others. The category facts are things like “San Francisco is a city” and “sunflower is a plant.”
NELL also learns facts that are relations between members of two categories. For example, Peyton Manning is a football player (category). The Indianapolis Colts is a football team (category). By scanning text patterns, NELL can infer with a high probability that Peyton Manning plays for the Indianapolis Colts — even if it has never read that Mr. Manning plays for the Colts. “Plays for” is a relation, and there are 280 kinds of relations. The number of categories and relations has more than doubled since earlier this year, and will steadily expand.

“Surrender humans,” says NELL.
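For the curious, the inference the Times describes can be sketched in a few lines of Python. To be clear, this is a toy illustration with made-up data and a made-up threshold, not NELL's actual algorithm: the idea is that category facts ("football player", "football team") constrain which relations are even possible, and text-pattern evidence then tips the scale.

```python
# Toy sketch of NELL-style relation inference (hypothetical data, not the real system).

# Category facts, like "San Francisco is a city."
categories = {
    "Peyton Manning": "football player",
    "Indianapolis Colts": "football team",
    "San Francisco": "city",
    "sunflower": "plant",
}

# Pretend counts of text patterns mined from web pages that suggest
# the "plays for" relation between a pair of entities.
pattern_evidence = {
    ("Peyton Manning", "Indianapolis Colts"): 42,
    ("Peyton Manning", "sunflower"): 1,
}

def infer_plays_for(x, y, threshold=10):
    """Infer 'x plays for y' only if the categories fit and evidence is strong."""
    if categories.get(x) != "football player":
        return False
    if categories.get(y) != "football team":
        return False
    return pattern_evidence.get((x, y), 0) >= threshold

print(infer_plays_for("Peyton Manning", "Indianapolis Colts"))  # True
print(infer_plays_for("Peyton Manning", "sunflower"))           # False
```

The category check is what lets the system rule out nonsense like a quarterback playing for a sunflower before it ever counts evidence, which is part of how NELL keeps its accuracy up while reading hundreds of millions of pages.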
