Computer teaches itself
Great. Juuuuust great. Have we learned nothing from Blade Runner? Battlestar Galactica? Terminator? Apparently not. A computer in a basement at Carnegie Mellon University is able to interpret semantics, the meaning of language. It gathers facts from the Web and links them to other facts it already knows.
From the Times:
The Never-Ending Language Learning system, or NELL, has made an impressive showing so far. NELL scans hundreds of millions of Web pages for text patterns that it uses to learn facts, 390,000 to date, with an estimated accuracy of 87 percent. These facts are grouped into semantic categories — cities, companies, sports teams, actors, universities, plants and 274 others. The category facts are things like “San Francisco is a city” and “sunflower is a plant.”
NELL also learns facts that are relations between members of two categories. For example, Peyton Manning is a football player (category). The Indianapolis Colts is a football team (category). By scanning text patterns, NELL can infer with a high probability that Peyton Manning plays for the Indianapolis Colts — even if it has never read that Mr. Manning plays for the Colts. “Plays for” is a relation, and there are 280 kinds of relations. The number of categories and relations has more than doubled since earlier this year, and will steadily expand.
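The excerpt describes two kinds of knowledge: category facts ("San Francisco is a city") and relation facts linking members of two categories ("Peyton Manning plays for the Indianapolis Colts"). NELL's actual system is far more sophisticated, but the basic data structure can be sketched as confidence-scored triples. Everything below, including the names, scores, and the 0.87 threshold borrowed from the quoted accuracy estimate, is an illustrative assumption, not NELL's code:

```python
# Toy sketch (NOT NELL's implementation): category and relation facts
# stored as tuples with confidence scores, filtered by a threshold.

# Category facts: (entity, category, confidence) -- all values assumed.
category_facts = [
    ("San Francisco", "city", 0.98),
    ("sunflower", "plant", 0.95),
    ("Peyton Manning", "football player", 0.97),
    ("Indianapolis Colts", "football team", 0.96),
]

# Relation facts: (entity1, relation, entity2, confidence), inferred
# from recurring text patterns rather than any single stated sentence.
relation_facts = [
    ("Peyton Manning", "plays for", "Indianapolis Colts", 0.91),
]

def high_confidence(facts, threshold=0.87):
    """Keep facts at or above the article's cited accuracy estimate."""
    return [f for f in facts if f[-1] >= threshold]

print(high_confidence(relation_facts))
```

The point of the sketch is only that a "fact" here is structured data with an attached probability, which is what lets the system assert "plays for" relations it has never read verbatim.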
“Surrender humans,” says NELL.