"The debate in language acquisition is around the question of how much specific information about language is hard-wired into the brain of the infant and how much of the knowledge that infants acquire about language is something that can be explained by relatively general purpose learning systems," said James McClelland, a psychology professor at Stanford University in Palo Alto, California.
McClelland says his computer program supports the theory that babies systematically sort through sounds until they understand the structure of a language.
"The problem the child confronts is how many categories are there and how should I think about it. We're trying to propose a method that solves that problem," said McClelland, whose work appears in the Proceedings of the National Academy of Sciences.
Expanding on some existing ideas, he and a team of international researchers developed a computer model that resembles the brain processes a baby uses when learning about speech.
He and colleagues tested their model by exposing it to "training sessions" that consisted of analyzing recorded speech in both English and Japanese between mothers and babies in a lab.
They found that the computer was able to learn basic vowel sounds right along with the babies.
Wednesday, July 25, 2007
But can it compute the power of love?
I'm not entirely sure I understand this article, but it seems there were (at least) two competing theories about how an infant learns sounds: one said the ability is hard-wired in the brain, and the other said the child picks it up through general-purpose learning. Well, apparently someone made a computer program modeled on a child's learning patterns and discovered that the computer picked up vowel sounds about as well as the child did, lending some credence to the latter theory.
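The article doesn't say how the model actually works, but the problem it describes — figuring out how many sound categories there are from raw examples — is the kind of thing a simple unsupervised clustering learner can illustrate. Here's a minimal sketch (not the researchers' actual model): synthetic "vowel tokens" are drawn from two invented formant-like distributions, and an online learner that starts with more candidate categories than it needs gradually discovers which ones the data supports. All the numbers and the setup are made up for illustration.

```python
import math
import random

random.seed(0)

# Hypothetical data: two vowel categories as 1-D "formant" distributions.
# The means (in Hz) are invented for illustration, not taken from the study.
TRUE_MEANS = [300.0, 700.0]
SPREAD = 60.0

def sample():
    """Draw one vowel token from a randomly chosen category."""
    return random.gauss(random.choice(TRUE_MEANS), SPREAD)

# The learner starts with MORE candidate categories than exist, at random
# positions; it has to discover how many are actually used.
means = [random.uniform(200.0, 800.0) for _ in range(4)]
counts = [1.0] * 4  # soft "usage" count per candidate category

for _ in range(20000):
    x = sample()
    # Soft responsibilities: categories closer to the token claim more of it.
    weights = [counts[i] * math.exp(-((x - m) ** 2) / (2 * SPREAD ** 2))
               for i, m in enumerate(means)]
    total = sum(weights) or 1.0
    for i, w in enumerate(weights):
        r = w / total
        counts[i] += r
        # Online running-average update: step size shrinks as a category
        # accumulates data, so its mean settles down over time.
        means[i] += (r / counts[i]) * (x - means[i])

# Candidates that absorbed substantial data end up near the true means;
# the unused ones keep tiny counts, which is how "how many categories?"
# gets answered by the data rather than being built in.
for m, c in sorted(zip(means, counts)):
    print(f"mean ≈ {m:6.0f} Hz, tokens absorbed ≈ {c:7.0f}")
```

The point of the sketch is the same one the quote attributes to McClelland: nothing tells the learner in advance that there are two vowels — the extra candidate categories simply wither from lack of data.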
Posted by Skemono at 8:16 PM
Labels: computers and math, language