Corante: technology, business, media, law, and culture news from the blogosphere
Corante Blogs

Corante Blogs examine, through the eyes of leading observers, analysts, thinkers, and doers, critical themes and memes in technology, business, law, science, and culture.

The Press Will Be Outsourced Before Stopped

Vin Crosbie, on the challenges, financial and otherwise, that newspaper publishers are facing: "The real problem, Mr. Newspaperman, isn't that your content isn't online or isn't online with multimedia. It's your content. Specifically, it's what you report, which stories you publish, and how you publish them to people, who, by the way, have very different individual interests. The problem is the content you're giving them, stupid; not the platform it's on."
by Vin Crosbie in Rebuilding Media

Travels In Numerica Deserta

There's a problem in the drug industry that people have recognized for some years, but we're not that much closer to dealing with it than we were then. We keep coming up with these technologies and techniques which seem as if they might be able to help us with some of our nastiest problems - I'm talking about genomics in all its guises, and metabolic profiling, and naturally the various high-throughput screening platforms, and others. But whether these are helping or not (and opinions sure do vary), one thing that they all have in common is that they generate enormous heaps of data.
by Derek Lowe in In the Pipeline

Disrobing the Emperor: The online “user experience” isn't much of one

Now that the Web labor market is saturated and Web design a static profession, it's not surprising that 'user experience' designers and researchers who've spent their careers online are looking for new worlds to conquer. Some are returning to the “old media” as directors and producers. More are now doing offline consulting (service experience design, social policy design, exhibition design, and so on) under the 'user experience' aegis. They argue that the lessons they've learned on the Web can be applied to phenomena in the physical and social worlds. But there are enormous differences...
by Bob Jacobson in Total Experience

Second Life: What are the real numbers?

Clay Shirky, in deconstructing Second Life hype: "Second Life is heading towards two million users. Except it isn’t, really... I suspect Second Life is largely a 'Try Me' virus, where reports of a strange and wonderful new thing draw the masses to log in and try it, but whose ability to retain anything but a fraction of those users is limited. The pattern of a Try Me virus is a rapid spread of first time users, most of whom drop out quickly, with most of the dropouts becoming immune to later use."
by Clay Shirky in Many-to-Many

The democratisation of everything

Over the last few years we've seen old barriers to creativity coming down, one after the other. New technologies and services make it trivial to publish text, whether by blog or by print-on-demand. Digital photography has democratised a previously expensive hobby. And we're seeing the barriers to movie-making crumble, with affordable high-quality cameras and video hosting provided by YouTube or Google Video and their ilk... Music making has long been easy for anyone to engage in, but technology has made high-quality recording possible without specialised equipment, and the internet has revolutionised distribution, drastically disintermediating the music industry... What's left? Software maybe? Or maybe not.
by Suw Charman in Strange Attractor

RNA Interference: Film at Eleven

Derek Lowe on the news that the Nobel Prize for medicine has gone to Craig Mello and Andrew Fire for their breakthrough work: "RNA interference is probably going to have a long climb before it starts curing many diseases, because many of those problems are even tougher than usual in its case. That doesn't take away from the discovery, though, any more than the complications of off-target effects take away from it when you talk about RNAi's research uses in cell culture. The fact that RNA interference is trickier than it first looked, in vivo or in vitro, is only to be expected. What breakthrough isn't?"
by Derek Lowe in In the Pipeline

PVP and the Honorable Enemy

Andrew Phelps: "Recently my WoW guild has been having a bit of a debate on the merits of Player-vs.-Player (PvP) within Azeroth. My personal opinion on this is that PvP has its merits, and can be incredible fun, but the system within WoW is horridly, horribly broken. It takes into account the concept of the battle, but battle without consequence, without emotive context, and most importantly, without honor..."

From later in the piece: "When I talk about this with people (thus far anyway) I typically get one of two responses, either 'yeah, right on!' or 'hey, it’s war, and war isn’t honorable – grow the hell up'. There is a lot to be said for that argument – but the problem is that war in the real historical world has very different constraints that are utterly absent from fantasized worlds..."
by Andrew Phelps in Got Game

Rats Rule, Right?

Derek Lowe: "So, you're developing a drug candidate. You've settled on what looks like a good compound - it has the activity you want in your mouse model of the disease, it's not too hard to make, and it's not toxic. Everything looks fine. Except. . .one slight problem. Although the compound has good blood levels in the mouse and in the dog, in rats it's terrible. For some reason, it just doesn't get up there. Probably some foul metabolic pathway peculiar to rats (whose innards are adapted, after all, for dealing with every kind of garbage that comes along). So, is this a problem?.."
by Derek Lowe in In the Pipeline

Really BAD customer experience at Albertsons Market

Bob Jacobson, on shopping at his local Albertsons supermarket where he had "one of the worst customer experiences" of his life: "Say what you will about the Safeway chain or the Birkenstock billionaires who charge through the roof for Whole Foods' organic fare, they know how to create shopping environments that create a more pleasurable experience, at its best (as at Whole Foods) quite enjoyable. Even the warehouses like Costco and its smaller counterpart, Smart & Final, do just fine: they have no pretensions, but neither do they dump virtual garbage on the consumer merely to create another trivial revenue stream, all for the sake of promotions in the marketing department..."
by Bob Jacobson in Total Experience

The Guardian's "Comment is Free"

Kevin Anderson: "First off, I want to say that I really admire the ambition of the Guardian Unlimited’s Comment is Free. It is one of the boldest statements made by any media company that participation needs to be central to a radical revamp of traditional content strategies... It is, therefore, not hugely surprising to find that Comment is Free is having a few teething troubles..."
by Kevin Anderson in Strange Attractor

The Loom


March 01, 2005

Building Gab: Part Two


Posted by Carl Zimmer

In my last post, I traced a debate over the evolution of language. On one side, we have Steven Pinker and his colleagues, who argue that human language is, like the eye, a complex adaptation produced over millions of years through natural selection, favoring communication between hominids. On the other side, we have Noam Chomsky, Tecumseh Fitch, and Marc Hauser, who think scientists should explore some alternative ideas about language, including one hypothesis in which practically all the building blocks of human language were already in place long before our ancestors could speak, having evolved for other functions. In the current issue of Cognition, Pinker and Ray Jackendoff of Brandeis responded to Chomsky, Fitch, and Hauser with a long, detailed counterattack. They worked their way through many features of language, from words to syntax to speech, that they argued show signs of adaptation in humans specifically for language. The idea that almost all of the language faculty was already in place is, they argue, a weak one.

Chomsky, Fitch, and Hauser have something to say in response, and their response has just been accepted by Cognition for a future issue. You can get a copy here. Chomsky, Fitch, and Hauser argue that Pinker and Jackendoff did not understand their initial paper, created a straw man in its place, and then destroyed it with arguments that are irrelevant to what Chomsky, Fitch, and Hauser actually said.

It was exactly this sort of confusion about language that Chomsky, Fitch, and Hauser believe has dogged research on its evolution. The first step to resolving this confusion, they argue, is to categorize the components of language. They suggest that scientists should focus on two categories, which they call the Faculty of Language Broad (FLB), and the Faculty of Language Narrow (FLN). FLN includes those things that are unique and essential to human language. FLB includes those things that are essential to human language but are not unique. They might be found in other animals, for example, or in other functions of the human mind.

Chomsky, Fitch, and Hauser argue that we don't actually know yet what belongs in FLN. The only way to find out is to explore the human mind and the minds of animals. But they argue that the road to an understanding of how language evolved must start here. Simply calling all of language an adaptation is a vague and fruitless statement, and one that leaves biologists and linguists unable to work together.

In their effort to portray language as a monolithic whole utterly unique to humans, Pinker and Jackendoff offer up evidence that Chomsky, Fitch, and Hauser consider beside the point. Consider the fact that the human brain shows a different response to speech than to other sounds. Chomsky, Fitch, and Hauser argue that you can't use the circuitry of the human brain as a simple guide to the evolution of its abilities. After all, some people who suffer brain injuries can lose the ability to read while retaining the ability to write. It would be silly to say that this is evidence that natural selection has altered the human brain because reading provides some reproductive advantage. Animals, Chomsky, Fitch, and Hauser argue, are a lot better at understanding the features of speech sounds than Pinker and Jackendoff give them credit for. In fact, they claim that Pinker and Jackendoff are behind the curve, relying on research that's years out of date. Given all that's been discovered about animal minds, Chomsky, Fitch, and Hauser argue that we should assume that any feature of language can be found in some animal until someone shows that it is indeed unique to humans.

There's a lot that's fascinating in all of the papers I've described in these two posts, but I find them frustrating. Pinker and Jackendoff may have erected a straw man to attack, but I think they can to some extent be forgiven. The 2002 paper by Chomsky, Fitch, and Hauser was murky, and their new paper, which is supposed to clarify it, is a bit of a maze as well. Consider the "almost-there" hypothesis, which they offered up in their 2002 paper. It's conceivable that FLN contains only one ingredient--a process called recursion, which I describe in my first post. If that's true, the evolution of recursion may have brought modern language into existence. On the one hand, Chomsky, Fitch, and Hauser claim to be noncommittal about the almost-there hypothesis, saying that we don't yet know what FLN actually is. On the other hand, they claim there is no data that refutes it. Doesn't sound very noncommittal to me.
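Since so much of the almost-there hypothesis hangs on recursion, it may help to see what the term means concretely. Here is a minimal sketch (my own toy illustration in Python, not drawn from any of the papers) of the defining property: a rule that embeds a structure inside another structure of the same kind, to arbitrary depth.

```python
def embed(depth):
    """Build a nested English sentence by embedding a clause
    inside a noun phrase of the same kind, `depth` times."""
    if depth == 0:
        return "the story amused us"
    # Each level wraps the previous sentence in another clause of the same type.
    return f"the claim that {embed(depth - 1)} surprised no one"

for d in range(3):
    print(embed(d))
```

Because the rule can apply to its own output, a finite rule yields an unbounded set of sentences; whether that capacity alone separates human language from animal communication is exactly what is in dispute.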

I'm also not sure how meaningful the categories of FLB and FLN are. Consider the case of FOXP2, a gene associated with human language. Chomsky, Fitch, and Hauser point out that other animals have the gene, and that in humans its effects are not limited to language (it's important in embryo development, too). So it belongs in FLB, because it's not unique enough to qualify for FLN.

It is true that other animals have FOXP2, but in humans, it has undergone strong natural selection and is significantly different from the versions found in other animals. And just because it acts on the human body in other ways doesn't mean that natural selection couldn't have favored its effect on human language. Chomsky, Fitch, and Hauser grant that features of language that belong to FLB may have also evolved significantly in humans. But if that's true, then deciding exactly what's FLN and what's not doesn't seem to have much to offer in the quest to understand the evolution of human language.

For now, the main effect these papers will have will probably be to guide scientists in different kinds of research on language. Some scientists will follow Pinker and Jackendoff, and try to reverse-engineer language. Others will focus instead on animals, and will probably find a lot of new surprises about what they're capable of. But until they come to a better agreement on what adaptations are, and the best way to study them, I don't think the debate will end any time soon.

Comments (18) + TrackBacks (0) | Category: Evolution


1. coturnix on March 1, 2005 12:06 PM writes...

Thank you. I have a knee-jerk response to go with Chomsky and Hauser and against Pinker based on the stuff they did before (and some stuff you pointed out in your two posts), but I will try to remain agnostic until I actually read all of their papers you link to.

Question: Is this debate really bimodal? Aren't there other versions? Where is Fodor on this issue? Or Terry Deacon? Cognitive ethologists?


2. Travis on March 1, 2005 06:12 PM writes...

I'm also curious about the other options. It seems to me like there isn't enough evidence to support either side's claims.


3. John Hardy on March 1, 2005 09:14 PM writes...

Yes, maybe it's a false dichotomy. Have you managed to get a take on Juliette Blevins's recently announced research?


4. Bob Koepp on March 2, 2005 10:34 AM writes...

Overall, a balanced presentation of a case where, surely, the devil (and the truth) is in the details.
As an epistemological aside, however, a lack of refuting/disconfirming data should not incline critical minds toward commitment to anything but suspended judgment. Chomsky and friends are rationally entitled to be noncommittal about the almost-there hypothesis.


5. Robin Turner on March 2, 2005 11:50 AM writes...

Thanks for a clear and readable summary of current thinking. It's interesting to see how the alignments change: my initial reaction to Pinker's work was "Nice, but he's so Chomskyan!" - now we see them on opposite sides of a different fence.


6. Quentin Crain on March 2, 2005 04:41 PM writes...

This quote:

On the one hand, Chomsky, Fitch, and Hauser claim to be noncommittal about the almost-there hypothesis, saying that we don't yet know what FLN actually is. On the other hand, they claim there is no data that refutes it. Doesn't sound very noncommittal to me.

Seems strange to me. Is this not the agnostic's position? (I do not know if there is a God. Also, I do not know of any proof that there is NOT a God. So, I take no position.)

-- Quentin


7. manju on March 3, 2005 12:28 AM writes...

Thanks for summarising the concepts in such simple but effective language. I must say it was really gratifying to understand these things at the first go.


8. Robert Karls Stonjek on March 3, 2005 01:41 AM writes...

Language isn’t simply one thing, and language deficits clearly show this. Those who have lost the ability to speak may still be able to swear or sing. Swearing in single words or short phrases most probably evolved directly from animal calls – indeed, the calls recruit words but are still mostly just calls (the words are meaningless; the expression and context provide the only utility).

One can lose word recognition (Aphasia) and continue to communicate in words. Losing the ability to interpret the emotional content of words seems to be a bigger defect (Agnosia).

If language as a block evolved, then numerous systems and sub systems must have evolved in sync and had some selectable utility at every step.

Homo erectus had a Broca’s area but insufficient lung control for speech (judging by the small spinal cord). This should be sufficient evidence for asymmetric evolution of language ability.

Having the ability to speak, to recognise words and their meaning, is not sufficient for *communication* to proceed, as Pepperberg’s study of the African Grey parrot ‘Alex’ seems to indicate.

What are we doing when we communicate? As most of our communication is ‘small talk’ we can rule out the exchange of information as being an essential and ever present component. What is a dialogue between two or more people if they do not exchange any new information? Most communication establishes and maintains the links and connections between modules of a bigger mind, just as much of the activity of neurons is to maintain linkages to adjacent and distant others.

The first and most fundamental call/sound/language usage in children merely announces “I am here” and “I am ready for communication”. In adults we can add “I am ready to connect to the database, the shared knowledge, the shared memory, the shared vision, the shared experience, the shared method etc”.

The inverting of the function of the brain, from a standalone modular computer responding only to environment cues to a node or module in a larger cognitive structure, is a singular change, but language is only one of the results. Art, music, dance and other activities, all of which can happily exist without language, also result from this ability to flip between the two modes – ‘cognitive structure with modules’ and ‘module in a cognitive structure’.

Kind Regards,
Robert Karl Stonjek


9. Steve Russell on March 3, 2005 01:04 PM writes...

The differential preservation of swearing is indeed interesting. There's gotta be some reason why swearing typically consists of "repressed" material--socially disapproved references to bodily functions, inappropriate and implausible sex acts with proscribed kin, blasphemy, and the like. What do these categories of expression have in common? Why do they seemingly occupy a separate channel or pipeline? Is there something that can be learned from the "social proscription" aspect of these utterances? Why are there (at least) two separate channels or modalities for socially-approved social-interaction speech (including "small talk," which of course some people have difficulty with) and socially-disapproved speech? What could the evolutionary advantages have been for SEPARATELY promoting and conserving these different modalities?


10. Robert Karls Stonjek on March 3, 2005 06:06 PM writes...

Swear words usually have no functional literal content in the context in which they are uttered. But all swear words, regardless of the language in which they form, have the common feature of being the strongest words emotionally, i.e. they evoke the strongest emotional reaction. When we swear, it is the emotional reaction that comes first, followed by a word with the nearest match for that emotional amplitude, but little else.

Words with positive connotation are also used in swearing eg ‘God’, ‘Jesus’ and ‘Heavens’, which only have in common with words like ‘shit’ the emotional amplitude they are capable of evoking.

Swearing, in single words and stock phrases, does not require grammar, word recognition or the sequencing of word utterances, any of which can frustrate the language ability. It is most likely that the equivalent of swearing occurred in pre-language humans, and that the swearing mechanism merely recruits appropriate words when they become available.

You might think “now, judging by the angle of the nail, the size of the hammer head, and the arc of my swing, I should be able to strike the head with sufficient force by….no, that’s not quite right. I must have miscalculated.” If you hit your finger with a hammer, you might say “damn it, I missed”; hit it harder and you may yell “damn”; even harder and you just scream; really hard and you remain momentarily silent. These are separate layers, which evolved from the last mentioned to the first mentioned, and they are lost in the opposite order. Further, after the most intensive response, we tend to ascend, given sufficient time, to the most moderate form.

Kind regards,
Robert Karl Stonjek


11. Steve Russell on March 3, 2005 06:59 PM writes...

Again, RKS, interesting stuff. As you say, emotionally expressive content can certainly include simple emotionally-positive content--like cheering on the local sports team--Yippee! Yahoo! or what have you. I agree that these expressions don't require much in the way of complicated conception, grammar, etc. Linguistically, I grant you, these expressions don't seem to be much advanced beyond grunts, roars, shrieks, and cries.
But aren't there brain aberrations (I'm probably not using the PC term--no intentional disrespect is intended toward anyone impacted by the disorder), like Tourette's, for example, where the "positive" emotional expressions are not unleashed in anything like an equal manner? Where it's difficult not to conclude that there is some interaction between the type of emotional expressions that are preserved (or preferentially "unleashed") and some sort of social approval/disapproval register or overlay?
If there is some sort of interplay between these untoward expressions and the brain's internalization of societal expectations, then I'm not as persuaded that the emotional expression "channel" is as clearly pre-language, pre-society as you seem to be suggesting.


12. Steve Russell on March 3, 2005 08:37 PM writes...

As I think on it further, though, I suppose the "damage" in Tourette's could be to a part of the brain that has to do with appropriate social affect, rather than anything directly to do with a particular language (or amplified-emotional-expression) "channel." So perhaps that doesn't make or break either of our points.
Still a fascinating area!


13. Robert Karls Stonjek on March 4, 2005 01:31 AM writes...

From his book ‘The Man Who Mistook His Wife for a Hat’, at the opening of chapter 10, ‘Witty Ticcy Ray’, Oliver Sacks eloquently describes Tourette’s syndrome thus:
“In 1885 Gilles de la Tourette, a pupil of Charcot, described the astonishing syndrome which now bears his name. 'Tourette's syndrome', as it was immediately dubbed, is characterised by an excess of nervous energy, and a great production and extravagance of strange motions and notions: tics, jerks, mannerisms, grimaces, noises, curses, involuntary imitations and compulsions of all sorts, with an odd elfin humour and a tendency to antic and outlandish kinds of play. In its 'highest' forms, Tourette's syndrome involves every aspect of the affective, the instinctual and the imaginative life; in its 'lower', and perhaps commoner, forms, there may be little more than abnormal movements and impulsivity, though even here there is an element of strangeness. It was well recognised and extensively reported in the closing years of the last century, for these were years of a spacious neurology which did not hesitate to conjoin the organic and the psychic. It was clear to Tourette, and his peers, that this syndrome was a sort of possession by primitive impulses and urges: but also that it was a possession with an organic basis—a very definite (if undiscovered) neurological disorder.”


14. Eric Baum on March 6, 2005 11:25 AM writes...

I came at these issues from a different direction, presented in my book What is Thought? (MIT Press, 2004). Rather than start with language, I attempted to understand thought and its evolution. Turing gave compelling arguments that whatever is happening in the brain can be represented as the execution of a computer program. But execution of a computer program is pure syntax. How can it have meaning, or understanding? And how can you evolve a program capable of solving new problems never encountered before? Research in computational learning theory over the last 20 years suggests the following proposal: if you find a compact enough program that behaves well enough in a complex environment, the only way that can happen is for the program to exploit underlying simple structure in the environment. The only way the code will be so compact yet so powerful is if it is modular, with modules corresponding to real concepts, reused in multiple computations in different ways. Such a modular program will be so constrained (to be so compact) that it will correctly compute how to solve new computational challenges it has not previously seen.

Now the obvious "compact program" here is the genome. The genome is quite compact, smaller than the source code for Microsoft Office when you strip out the junk. The brain is 100 million times bigger. Complexity theory tells us that learning is a hard problem, requiring vast computation. Yet we learn too fast. The only way this can happen is because evolution has already done most of the work, building into the genome inductive biases that allow us to learn automatically and fast. You can learn meaningful things from a single presentation, but it would be almost impossible for you to learn meaningless things. What meaning means is that your learning is constrained by previous
So the mind is a huge modular program, built through this Occam's razor on numerous modules essentially coded into the genome. What then is language? Well, once you assign labels to modules, you can communicate programs (or more precisely, guide a listener to construct a program). Metaphor is a manifestation of this reuse of modular code: you spend, borrow, waste, invest time because you think about time by reusing a module for valuable resource management. This explains how children learn words so fast and effortlessly: they already have the computational modules; all they are doing is attaching labels, and the concepts are incredibly salient because of the Occam structure, since they are meaningful and the program is highly constrained. This indicates how you can put sentences together in infinite ways: the Occam procedure builds the modules to be reusable; the whole point is that by finding such a compact structure, you generalize to all kinds of new problems.

An alternate view held by some linguists seems to be that words enable thought, that animals are incapable of thought, for example about objects not present. I think the evidence for this is weak and that the evidence against it is strong, e.g. introspection denies it (plenty of mathematicians proclaim they don't think verbally, and some people have been observed who lost verbal ability temporarily or permanently through epilepsy or lesions, yet could still reason). But moreover, I don't understand how it could be possible. To have meaning, the words must summon the modules implementing the computations. But then it is the computational modules that do the actual work. It is possible that the discovery of language was made possible by the discovery of a new method of interface among modules, but after examining this question at some length in What is Thought? I concluded that the data seems to be explainable without postulating such, so the principle of simplicity militates against it.

This picture can also explain the divergence between human and animal cognition solely through language as a communicative medium. Recall, discovery of meaningful computational modules is a hard computational problem, requiring extensive search. Animals can more or less engage in discovery of new modules only within a single lifetime. Humankind, through our ability to guide listeners to construct programs, has discovered over generations a more powerful programming superstructure built on top of the concepts coded in the genome. A review of many differences in cognitive abilities finds they can all be naturally explained in this way: for example, human theory of mind seems built in this way on top of subroutines already present in plovers and chimpanzees. Our more powerful TOM is built on discoveries made over generations and communicated to children through bedtime stories, fiction, and studying Shakespeare in school.

This does raise the question of why language took so long to evolve, if the computational structure was in place and all that was necessary was to attach labels. This could be explained if evolution was stuck in a potential well from which it couldn't readily escape. A particular proposal of such a well, by Martin Nowak and collaborators, was the discovery of digital encoding (sentences made of words, words of phonemes). Nowak et al have argued that until you are expressing many concepts, it is fitter to adopt an analog encoding: you can't use lots of words till you discover digital encoding, and you can't discover digital encoding till you use lots of words, so evolution was stuck. What is Thought? surveys their proposal (in my context) and also makes a second one: you can't start using words till somebody else is prepared to learn them, and he can't be prepared to learn them till you are using them. It is conceivable that language was launched by a single pair of proto-human Einsteins who conceived the plan of naming modules and learning the names. Of course, once proto-humans started using language, we no doubt would have evolved to use it better. For example, What is Thought? surveys the evidence for specific grammar adaptations as a case study of inductive bias programmed into the genome, and even considers how the Baldwin Effect may have been involved.

Eric Baum


15. rufus on March 7, 2005 04:53 PM writes...

Has Chomsky ever once had a response to criticism (in linguistics or politics) that wasn't some variant of "You didn't understand what I said"?

Perhaps it should have occurred to him by now that the only thing more muddled than his thinking is his writing.

ahem, rant off.

In other news, this isn't what an agnosia is:

"Losing the ability to interpret the emotional content of words seems to be a bigger defect (Agnosia)."

Agnosias involve deficits in object perception and recognition. I'm not sure there's a concise word for deficits in emotion recognition.


16. Robert Karls Stonjek on March 8, 2005 06:05 AM writes...

Sorry, I missed the 'Tonal' from 'Tonal Agnosia'.

'Tonal Agnosia' is exactly what I said it is - the loss of recognition of the emotional content of speech. Although the expression refers to the loss of recognition of the tonal variations in speech, which is probably what first came to the attention of medical experts, people with Tonal Agnosia are also unable to deduce the emotional content of written work.



17. wmr on March 8, 2005 06:19 PM writes...

I am currently reading "The First Idea" by Stanley I. Greenspan and Stuart G. Shanker. Their proposed schema for the evolutionary development of concepts and language goes through the stages of learning in succession the abilities "to attend, interact with others, engage in emotional and social signaling, construct complex patterns, organize information symbolically, and use symbols to think."

This is a popular book rather than an academic work and I am not a professional in these matters. Can any of you experts out there tell me if there is any value in their ideas?


18. rufus on March 8, 2005 07:58 PM writes...

Ah, thanks for the clarification Robert.

I'd never heard of tonal agnosia before. Interesting stuff.



