When cultures change and new cultural tasks give rise to new demands for cognitive competence, human plasticity makes it possible for the new outcomes to be reached. -- JOHN U. OGBU [1]
Technology is here to stay. We have to be damn sure we do it right -- whatever "right" means. Therein lies the vision -- and the challenge. -- GARY PETERSON, SUPERINTENDENT, LEARNERS' MODEL TECHNOLOGY PROJECT, CA
In a large classroom, groups of teachers cluster around computer monitors. Their charged intensity belies the summer heat that presses against the air-conditioned building, a contemporary anachronism on a quiet, white-pillared campus whose traditions reach back well over a century. But no one is gazing out the window at the green lawns, white clapboard buildings, and gracious, overarching trees. As their instructor walks to the center of the room, some remain engrossed; others look up with an expression that can best be described as dazed.
"Well," he says. "You came to this workshop to learn the newest methods for teaching math, and I've just shown you a forty-five-dollar computer program that can do all the operations of algebra, trig, and calculus. This afternoon I will demonstrate a pocket calculator that will soon be available which can do graphing and geometry. Many of you spend up to eighty percent of your class time teaching kids to do these calculations that a simple program can now perform almost instantly. So, I've only got one question. What do you plan to do for the rest of your life?"
"Retire!" says one man, obviously eager to head back to his green-shuttered dormitory.
"Wait! This is exciting!" exclaims another. "Think of the problems we'll be able to work on. We'll have to teach the kids to understand the questions. Even if the machines know how, somebody's going to have to know why. Students can't plug in the right data and know what operations to use unless they understand the problem."
As the group adjourns for lunch, I approach the leader, Lew Romagnano, to thank him for allowing me to sit in on this impressive demonstration.
"What sort of impact do you think computers will have on the human brain?" I ask him.
"Who knows. You're the brain person, not me! Probably brains will get lots bigger because we won't have all this computation nonsense to worry about anymore. Seriously, you're talking about real mathematical thinking -- patterns you can see -- without doing hours of arithmetic. If we didn't have to teach long division for six months in the fifth grade, think what else we could teach -- probability, statistics, geometry, mathematical reasoning. It's sure to have some sort of effect on the brain."
MINDS IN AN "INFORMATION AGE"
As I have worked on this book, my file optimistically labeled "Future Minds" has overflowed and been expanded until it has finally assumed book-length proportions of its own. I search it to discover what may happen to a human brain that takes on machines as intellectual boon companions, but I don't find any answers. Even the dimensions of the question, in fact, aren't totally clear. The first is doubtless what new demands will be placed on the human mind as a function of the "information age."
With a proliferation of new technology, occupational demands on the human brain are shifting from direct manipulation of the physical universe (e.g., putting parts together on an assembly line, driving a tractor, going to a library to look up research articles, mixing chemicals in a lab, making change from a cash register) to managing machines that perform these functions. The machines, in turn, churn forth and instantly transmit inhuman quantities of data. The amount of available information is now estimated to double every two years -- an astounding harbinger of future possibilities, but an alarming reminder that we now need machines to manage our knowledge as well as our commerce.
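Taken at face value, that estimate compounds startlingly fast. The only assumption in the sketch below is that the two-year doubling rate holds steady:

$$
I(t) = I_0 \cdot 2^{t/2}, \qquad \text{so } I(10) = 2^{5}\, I_0 = 32\, I_0 .
$$

By that arithmetic, a single decade multiplies the stock of available information thirty-two-fold, and two decades push it past a thousandfold.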
It is estimated that forty percent of new investment in plant and equipment is for electronic data-shufflers. A proliferation of computers, video, telecommunications, copying and FAX machines, and various permutations among them encapsulates and speeds the pace of human discourse.
These changes inevitably cause fundamental shifts in mental activity. Machines become extensions of our brains. Thinking is referred to as "information processing"; working requires more and more ability to access, manipulate, and use data. The worker of the future, we are told, must be prepared to act as an individual manager of both the information and the technological tools by which it is assembled: computer memory banks and data bases, electronic libraries, video encyclopedias, etc. Meanwhile, with instantaneous transmission of written as well as oral communication all over the world, the human "patience curve" wavers perceptibly.
But someone has to "see the patterns," figure out the purpose and the plan for this frenetic fact-factory. One might also hope that people will retain enough control to reflect on where it is all taking us -- and why.
Subtle shifts in what the human brain is required to do will eventually cause it to modify itself for new uses, at least in those who are either young or sufficiently motivated. Speculations naturally abound as to what these effects may be, but if I restricted this chapter to what has been proven about technology's ultimate impact on brains, it would end right here.
Nevertheless, since electronic developers are lining up to stake out claims in the brains of today's children, I believe we should try to figure out a few more questions to ask before we sign the contract. We have already witnessed clear changes in children's habits of mind: declining verbal skills, changing patterns of attention, a less reflective approach to problem-solving. How might these changes fit with our conjectures about the future? Are human brains about to get caught in the experiential fragmentation of machine technology, or will they gain broader abilities to stand back and understand what is happening?
EVOLVING BRAINS?
One of the questions I often get after presenting the ideas set forth in this book is whether the changes so consistently observed in students may represent some sort of evolutionary trend. Is it possible that print literacy and/or the process of extended mental reflection are merely evolutionary way stations for a species en route to bigger and better things? As we saw in Chapter 3, neuroscientists have proposed that the inner workings of the brain itself adapt themselves to new environments through a Darwinian model of competitive selection.
Scientists agree that generational changes in cognitive abilities are probably part of an evolutionary process. Dr. Stephen Jay Gould, noted evolutionary biologist and authority on Darwinian theory, believes such changes are primarily associated with a dynamic process of "cultural evolution." He argues that genetic changes, in the strict Darwinian sense, take far too long to be so readily noticed, although they, too, are doubtless occurring over the long march of human mental development.
Most geneticists, of course, do not believe that simply using the organs of one's body differently can cause heritable changes in the underlying genes. If some motor neurons in a monkey's brain wither because he lost the use of two fingers, his offspring will not be born with either the fingers or the neurons missing.
For humans, however, so-called "inheritance" of intellectual traits and habits is possible, says Gould, because it works through a different mechanism. Even Darwin believed that "cultural evolution," which occurs only in human societies, causes changes in knowledge and behavior that can then be transmitted across the generations. As Gould explains it,
Human uniqueness resides primarily in our brains. It is expressed in the culture built upon our intelligence and the power it gives us to manipulate the world. Cultural evolution can proceed so quickly because it operates, as biological evolution does not, in the "Lamarckian" mode -- by the inheritance of acquired characters. Whatever one generation learns it can pass on to the next by writing, instruction, inculcation, ritual, tradition, and a host of methods that humans have developed to assure continuity in culture. [2]
Cultural evolution is not only rapid, he says, but also readily reversible from generation to generation because it is not coded in the genes. Other scientists agree that human gray matter is "capable of meeting widely varying cultural assumptions" and thus may change rather rapidly. Each generation of human brains seems to have the potential to develop new types of neural networks or find new combinations for old ones that haven't been fully tapped.
Another expert told me he explains the mental flexibility of our species as somewhat analogous to a pitcher of martinis at a cocktail party. The same (genetic) ingredients are always there -- gin and vermouth -- but over the course of the evening the hostess may add more of one or the other and the mixture will change slightly, although it's still a martini. The genetic basis of the human brain may be similarly constant, but its ingredients can get mixed and matched differently during the process of adaptation.
One reason inherited forms of intelligence or behavior may shift, say some scientists, is that genes can be either turned on or turned off to varying degrees by environmental demand. As a species, we have talents we probably haven't even used yet. According to Gould, human brains are "enormously complex computers" that can perform a wide variety of tasks in addition to the ones they first evolved to perform:
I do not doubt that natural selection acted in building our oversized brains -- and I am equally confident that our brains became large as an adaptation for definite roles. . . . [These complex brain] computers were built for reasons, but possess an almost terrifying array of additional capacities. [3]
Gould adds, incidentally, that evolutionary design can degenerate as well as improve. [4] Apparently, as another authority opined, our current state represents "not a package of perfection, but a package of compromises." [5] Will we continue to "improve"? By what standards can we judge?
Dr. Jerome Bruner offered a thoughtful response to my questions about changing brains in a technological age. "The only thing I can say with some degree of certainty," he wrote, "is that the evolution of human brain function has changed principally in response to the linkage between human beings and different tool systems. It would seem as if technology and its development leads to a new basis of selection. . . surely there must be a variety of changes in progress that resulted from writing systems, even though writing systems were introduced only a short time ago as far as we reckon evolutionary time. And now, of course, we have computers and video systems, and how long before the selection pattern changes as a result of these?"
But, he advised, we should first worry about more practical issues. "The fact of the matter is that we need a much broader distribution of high skills to run this culture than ever was needed before, and the failure to produce that distribution has been the cause of serious alienation. If we produce a two-tier society, it means in effect that we have two separate sets of evolutionary pressures operating -- one within the elite group that calls for an acceleration of ability, and one within an underclass where no such pressure operates.
"See what you can make of that," he concluded. [6]
What kinds of intelligence will be most likely to produce these new forms of "high skills"? That must be the next question.
NEW INTELLIGENCES?
The cognitive skills required by the new computer technology require precise definitions, linear thinking, precise rules and algorithms for thinking and acting. -- Committee on Correspondence on the Future of Public Education [7]
We're going to have to get out of this linear model of thinking. I suppose major change is the only way we are going to break loose from the formal mind and become general systems thinkers in time for species preservation to occur. We've pretty much, for the time being, exhausted the scientific method. We've objectified life about as far as it can be objectified -- and it hasn't worked. You can only go so far with the right leg, now it's time to move the left leg forward for a while. -- Dr. Dee Coulter, Naropa Institute
Obviously, no agreement exists on the nature of the "new intelligences." Many claim that mental abilities for the future must include widened perspectives, a broader range of mental skills, and a great deal of open-ended imagination to come up with solutions to the world's big problems. On the other hand, some believe we should adapt our human mentalities more closely to the precision of the machines.
One issue concerns the kinds of intelligence we should encourage in children who will live in a world where machines can do most of the mental scut work. What should we be teaching if the human brain will soon be relieved of the responsibility for doing arithmetic problems, spelling accurately, writing by hand, and memorizing data? At some time in the not-too-distant future, every student -- at least in districts where funding is available -- may work at a computer station where all these operations will be performed by a machine. Computerized data bases will instantly access, sort, and summarize any type of information. Word processing programs, perhaps with the aid of spelling, grammar, and punctuation checkers, and outlining programs designed to help the writer organize ideas, will enable rapid note-taking and report writing.
At some point, this equipment may become pocket-sized -- a portable, permanent adjunct to the brain's memory systems. What will be important to learn then? Probably not the names and dates of the kings of England or the formula for the area of a parallelogram.
Glimpses of Electronic Learning
Some of the applications already available or on the drawing boards open astonishing windows onto future learning. If a student wants to learn about the French Revolution, for instance, here is a not-so-imaginary scenario: A program will project on her monitor screen a written and/or narrated summary of facts and events, lists and/or abstracts of relevant historical research, and an animated time line of key events with a visual enactment of important scenes, set to the music of the period. She may choose to drill herself on the words of "La Marseillaise" or some French verb tenses, or she may choose a program that lets her wander through the Louvre, browsing among relevant paintings. She might participate in a mock interview with Marat or visit the prisoners in the Bastille -- in French with English translation, or vice versa. She may then choose to perfect her French vocabulary and spelling by playing a game; each time she answers correctly, she saves one aristocrat from the guillotine. She will then visit a French street market to use the words she has just learned in a conversation on interactive video that will also check out her accent and idioms (computers that can accurately hear and "understand" children's voices are not yet available, but there is every reason to believe they will be before too long). Or she may boot up a "simulation" in which she assumes the role of a leader on either side of the dispute, sits in on planning sessions where she makes decisions about key turning points in the Revolution, and then learns the historical consequences of her choices.
These activities, prototypes for most of which are already available, assuredly understate the possibilities of the next decade. Defining the "basics" that children will still need to master in such a world will get you a good argument among any group of educators. Maximizing the effectiveness of such technology may require well-reasoned reconsideration of some long-cherished ideas about who teaches what to whom, when, and how.
Technology will enable radical changes in teaching formats. Whether or not children will still need classrooms -- or even human teachers -- in the new age of instant communication is also a nice discussion-starter. With equipment developed by IBM, students even now can sit at home -- or in different parts of the country (world?) -- with computerized video monitors through which they communicate instantaneously with classmates and instructor. The teacher can ask a question and see an immediate tally on his screen of every student's response, so he knows at once who understands and who does not. Of course, such questions tend, at least so far, to be of the multiple-choice variety. Will we still need oral language when we spend most of our time on keyboards or pushing buttons? What new sorts of perceptual or mental skills will be required? And what will happen to some of the old ones -- not the least of which is interpersonal/emotional development -- as the brain devotes its time and connectivity to different challenges?
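The instant tally mentioned above, incidentally, is the trivial part. The sketch below is not IBM's system, just a minimal Python illustration of counting multiple-choice responses as they arrive; the student names and choices are invented for the example.

```python
from collections import Counter

def tally_responses(responses):
    """Count how many students chose each answer option.

    responses: mapping of student name -> chosen option ("A", "B", ...).
    Returns a Counter mapping each option to the number of students who chose it.
    """
    return Counter(responses.values())

# Hypothetical classroom poll: which option did each student pick?
responses = {"Ana": "B", "Ben": "B", "Chloe": "A", "Dmitri": "B"}

print(tally_responses(responses))  # Counter({'B': 3, 'A': 1})
```

The hard part, then as now, is pedagogical rather than computational: writing questions whose tallies actually reveal who understands.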
Forward to the "Basics": What Will They Be?
The computer age may also promote different types of learning abilities than the ones traditionally valued and rewarded. Facility for memorization, spelling, or good handwriting may not seem all that important anymore. Some people believe these basic disciplines should still be stressed because they build up children's brains for other types of thinking, but psychologists are unsure about the generalizability of specific types of "mental exercise." It may be better, they say, to work on general reasoning ability so that children will be able to learn all types of new skills, since many -- perhaps most -- of the occupations they may eventually pursue haven't even been invented yet! Children clearly need to be taught habits of mental self-discipline, but no one has established the best way to do so.
Will children still need oral language skills? Very likely, both for personal communication and as a foundation for reading and writing -- even if it is connected with a computer screen. A recent government report entitled "Technology and the American Transition" acknowledged that all workers will need more mental flexibility than has previously been the case. Yet the "protean" mentality that will prosper in the new work force must still possess sophisticated verbal skills. "The talents needed are not clever hands or a strong back," the report concludes, "but rather the ability to understand instructions and poorly written manuals, ask questions, assimilate unfamiliar information and work with unfamiliar teams." [8]
Overall, most thoughtful people who have considered the skills that will be needed -- and reinforced -- in brains of the future agree that higher-level abilities will be required from everyone. Yet, according to Priscilla Vail, common definitions of what constitutes "higher-level" skills may also change. She points out that the educated person used to be one who could find information; now, with a flood of data available, the educated mind is not the one that can master the facts, but the one able to ask the "winnowing question."
"The ones who have kept alive their ability to play with patterns, to experiment -- they will be the ones who can make use of what technology has to offer. Those whose focus has been on getting the correct answers to get a high score will be obsolete!" [9]
Dr. Howard Gardner has reminded us that intelligence usually gets defined in terms of which individuals can solve the problems or create the products that are valued in the culture at any given time. Brain systems for different types of intelligence are relatively discrete; improving one will not necessarily improve others (e.g., playing video games will not make children faster readers; learning the organization needed to write computer programs will probably not improve their skills in cleaning up their rooms). Moreover, when time and practice are devoted to one set of skills, space for others may be preempted. It appears as if minds that will be most valued in the future will need to have a remarkable combination of "big picture" reasoning and analytic acuity. They will be able to "see" patterns, but also communicate and interpret language accurately. Yet some believe that these two types of abilities are fundamentally at odds with each other.
DUAL ABILITIES IN THE UNIFIED MIND
It is quite possible that linear thinking, as opposed to imagery thinking, has been one of our handicaps in trying to solve [many of our] pressing worldwide problems. The mode of thinking we need ... must help us to visualize the connections among all parts of the problem. This is where imagery is a powerful thinking tool, as it has been for scientists, including Einstein. -- Mary Alice White, Teachers College, Columbia [10]
In general the competent uses of data bases requires a careful, rather than a sloppy understanding of ... words. We need to educate people to use the language with much greater precision than they are presently accustomed to using. -- Judah L. Schwartz, MIT [11]
Visual Literacy
A sixth-grade student nervously walks to the front of the classroom to present his research report on different types of aircraft. Inserting a video cassette into a monitor, he presses a button and the presentation begins. A series of film clips illustrates aviation scenes. As each type of plane is shown, the student reads a brief sentence introducing it, then remains silent as his classmates watch the remainder of the clip. As the video ends, a plane explodes in midair. The audience cheers. The teacher compliments the "author" on his creativity.
This "demonstration lesson" of uses of video in the classroom elicits a mixed response from school principals invited to view it. Some are delighted. "The boy showed a lot of imagination." "Endless possibilities." "Look how intent those kids were . . . they rarely listen that well!"
Others are more skeptical, particularly about the absence of extended narrative. The pictures, indeed, tell the story, but what happened to reading, writing, and reasoning? The rapt attention of the child's classmates is questioned. Is their response to the screen merely conditioned -- but uncritical? Is this the shadow of the future? Should we be worried?
Excerpts from a "video encyclopedia" are shown. In one "entry" a contemporary demagogue is seen delivering a segment of an emotionally charged oration. This man is a persuader and his delivery capitalizes on body language; his views are also controversial. But no analysis accompanies this "entry"; encyclopedias are, after all, compilations of fact. This film is an accurate record of what occurred -- but is it "fact"? Who can guarantee students access to opposing views? Who will show them how to ask the winnowing questions?
Video is persuasive. For immature viewers -- and perhaps for mature ones as well -- it pulls on emotions and evokes mood more readily than does print. Visual media are often accused of being more subjective. Their immediacy may bias against thoughtful analysis, at least for people untrained in critical viewing. A series of images may also tell a more fragmented story than the linked ideas that follow each other in a text. Certain types of visual information (e.g., television) may require less effortful processing than print media. Yet visual media are effective conveyers of some aspects of experience. Seeing film clips from a war can amplify and add perspective to reading about it in a history book. Visual images encourage intuitive response. Video presentations also have unlimited boundaries of time and space; they are free from the narrative chronology of text. Moreover, most brains tend to retain colorful visual images more readily than what they have heard or seen in print.
The growing question, of course, is whether so-called "visual literacies" could replace print. Will instruction manuals of the future rely on pictures and diagrams instead of words? Will holistic/emotional responses blot out more precise verbal/analytic forms of reasoning? Might human reasoning actually rise to higher levels if we were unencumbered by the constraints of syntax and paragraph structure? Are we on the cusp of a major alteration in the way the human brain processes information? After all, human beings have been receiving information from visual and interpersonal communication for over ten thousand years; they have only been getting it from readily available print during the last five hundred.
Thought Without Language
Should we regard rock videos replacing Shakespeare as an evolutionary advance? Does language place artificial constraints on ideas that might be liberated by nonverbal reasoning? Is thought possible without any sort of symbol system? In The Dancing Wu Li Masters, Gary Zukav explains how he thinks reality gets fragmented by the use of symbols -- particularly words. As an example he uses happiness, a global state of being that cannot fairly be boiled down to a symbol. Pinning a word onto this indescribable state changes it to an abstraction, a concept, rather than a real experience. "Symbols and experience do not follow the same rules," states Zukav. "Undifferentiated reality is inexpressible." The goal of "pure awareness" sought by Eastern religions is presumably an example of transcending the need to distort understanding by trying to communicate it.
Zukav's main point is that holistic approaches to reality, which he relates to the right hemisphere of the brain, more accurately represent the principles of our physical world, exemplified in physics and mathematics. Their reality, he claims, is actually distorted by forcing them into symbols. Although he does not solve the problem of how to communicate ideas "which the poetic intuition may apprehend, but which the intellect can never fully grasp," he recommends broadening our outlook into the "higher dimensions of human experience." [12]
So-called "nonverbal thought," freed from the constraints of language, is a recognized vehicle for artists, musicians, inventors, engineers, mathematicians, and athletes. [13] Nonverbal thought is not always a poetic and undifferentiated whole, but can also relate to much more mundane matters and proceed sequentially (e.g., picturing the steps in assembling a machine or turning it over in one's mind and examining the parts or mentally rehearsing the sequence of body movements in a tennis serve). Much important experience can't be reduced to verbal descriptions. Yet in schools, traditionally, the senses have had little status after kindergarten.
"Even in engineering school, a course in 'visual thinking' is considered an aberration," says one critic who believes that too much emphasis on verbal learning places conceptual limits on inventiveness. By neglecting such studies as mechanical drawing for all students, he insists, we are cutting out a big portion of an important, and valid, form of reasoning. [14]
Can computers guide people in nonverbal reasoning? Dr. Ralph Grubb of IBM is an enthusiastic advocate of this idea. Computerized simulations of math, engineering, architectural, and scientific problems will help us get away from our "tyranny of text" and move into more visual thinking, he claims. For example, computers can now produce three-dimensional models of scientific data, graphs or representations that can enable a manager to "see" all the aspects of a complex financial situation, or simulations that allow an architect to take a visual "walk" through a building she is designing. Although, to the uninitiated, some of these simulations are totally baffling, they are doubtless the mode through which much information will be represented in the future. "Visual metaphors will strip away needless complexity and get right down to the idea," he said. "Flexibility is the key -- you have to be able to shift between perspectives." [15]
When I was talking with Dr. Grubb, however, I noticed that all his examples involved mathematical, mechanical, or artistic fields. Can nonverbal metaphors also mediate the study of history? Is body language a good criterion for judging a political candidate? Perhaps we should make sure the "tyranny of text" gets supplemented rather than replaced.
Some thought certainly needs to move beyond (or remain before) words. Most people who have studied this question, however, insist that written language and related symbol systems (e.g., mathematics) should remain important vehicles for organizing thought, thinking abstractly, reasoning about the future as well as the present, and communicating some types of information more precisely. While mathematical ideas may best be apprehended holistically, the process of thinking through a problem in a step-by-step sequence to get it down on paper confers additional advantages, not the least of which is the ability to communicate the procedures to someone else. [16]
Since much nonverbal reasoning depends on visual imagery, many people wonder what more exposure to video will do to children's abilities to gain these "higher dimensions of human experience." Although I haven't heard anyone suggest that TV has improved kids' spiritual natures, one noted drama teacher told me she sees children of the video generation as better able to handle a "multiplicity of images, less stuck in narrative chronology." "The camera is a dreamer," she pointed out, one that encourages their imaginations. [17] Other teachers say just the opposite. "They have lost the ability to visualize -- all their pictures have been created for them by someone else, and their thinking is limited as a result."
Curiously enough, however, visual stimulation is probably not the main access route to nonverbal reasoning. Body movements, the ability to touch, feel, manipulate, and build sensory awareness of relationships in the physical world, are its main foundations. A serious question now becomes whether children who lack spontaneous physical play and time to experiment with the world's original thought builders (e.g., sand, water, blocks, mom's measuring spoons, tree-climbing, rock-sorting, examining a seashell or the leaf of a maple tree, etc.) will be short-circuited in experimentation with nonverbal reasoning. Children who are rarely alone may well miss out on some important explorations with the "mind's eye." Frantic lifestyles do not lend themselves to imagination and reflection any more than aerobics classes for toddlers encourage manipulation of life's mysteries. Inept language usage is a serious problem, but inept insights might well be an even greater disaster.
Alphabets and Changing Brains
If (or as ... ?) we shift our major modes of communication from books to video, handwriting to computer word processors, what happens to the evolution of the brain? Such shifts, along with changes in the related patterns of thought, have both prehistoric and historic precedent. It is generally assumed that when humans learned to speak to each other, not only habits but brains changed. The development of written language is also believed to have had cognitive consequences -- or at least accompaniments. Not only does literacy, itself, change thinking, but the brain is apparently so sensitive to the input it learns to process that even different forms of the alphabet may have different effects.
The Western alphabet, in particular, has been linked to (or blamed for, as you will) our form of scientific thought and our system of formal logic. In The Alphabet Effect, Robert Logan points out that Eastern writing systems such as Chinese ideographs ("picture writing") and the more linear, alphabetic-phonetic scripts of the West show differences that he relates to "right-brained" and "left-brained" modes of thought. Logan suggests that while writing systems cannot themselves cause social changes, their usage encourages different types of cultural -- and perhaps neural -- patterns.
During the so-called Dark Ages in the West, when reading and writing diminished, many major advances in inventions and manual technologies took place. Logan implies that liberation from the written alphabet may have enabled relatively more progress in the fields of practical arts, mechanical and agricultural inventions, and the establishment of the framework of Western democracy in the Magna Carta. These, he suggests, are related to more holistic functions of the brain that were freed up by lessened demands to process the printed word. [18]
After the invention of the printing press, academic learning was revived, and a new infatuation with the objective empiricism of the scientific method took hold. As we saw above, some now dare to question the enduring utility of this stage of the progression. Is it time for another change?
Certain specific features of alphabets may be responsible for differences in the way the brain processes them. Dr. Derrick de Kerckhove of the McLuhan Program in Culture and Technology at the University of Toronto has presented evidence that Indo-European alphabets (like ours), in particular, "have promoted and reinforced reliance on left-hemisphere strategies for other aspects of psychological and social information processing." The relevant features include the left-to-right progression of print; precise differentiation of vowel patterns, which tap left-hemisphere auditory areas; and the linear, speech-like ordering of sounds. These forms may have a "reordering effect" on mental organization and even brain structure, suggests de Kerckhove. [19]
De Kerckhove points out that our more abstract ways of thinking -- which, he believes, do not come "naturally" to the human brain -- were probably imposed, at least in part, by this particular system of writing. The exact rendering of the writer's language afforded by our alphabet (in contrast to more open-ended symbol systems such as pictorial scripts, which allow a wider range of personal interpretation of what was said) takes the reader away from his own associations and interpretations and enables him to reach into the more abstract logic behind the writer's thinking.
If such fine-grained differences between writing systems can indeed change thinking and even the related brain structures, it seems evident that a major shift in "the ratio of the senses" (in McLuhan's words), from print to visual processing, could have even more dramatic effects.
Some observers find this possibility troubling. If print literacies get trampled under the hooves of technological innovation, what will happen to our thinking? Will we lose precision of thought along with precision of expression? Will our ability to communicate outside a face-to-face context become limited? What will happen to the disciplined analytical and inductive thinking that serve creative intuition? [20] While purely verbal thinking may, indeed, be "sterile," it is doubtless an important adjunct to higher-level reasoning and creativity.
. . .while nonlinguistic symbol systems such as those of mathematics and art are sophisticated, they are extremely narrow. Language, in contrast, is a virtually unbounded symbol system. . . the prerequisite of culture. In sum, we do not always think in words, but we do little thinking without them. [21]
Dr. Diane Ravitch, noted scholar and educational theorist, is worried about current attitudes that imply "a longing to get away from language, as though we would all be more primitive, more spontaneous, and more joyful. Then we could read each other's body language rather than have to communicate through written devices."
"Enemies of print literacy," she admonishes, are all too ready to say, "Well, man, this is where it's happening, let's go with the flow." But blind faith that change inevitably implies progress is just as foolish as refusing to accept new ideas at all. Throwing out the precision of language would be particularly dangerous at a time when balance is badly needed. Print and visual literacies can and should complement each other; visual images open doors to new modes of understanding, but print is still necessary for thoughtful analysis. [22]
This argument will probably assume greater urgency as the computer age forces us toward more analytic precision at the same time it demands visualization of new technological applications. Tension between visual and verbal reasoning, in fact, lies at the kernel of the information-age paradox. Our children will need both.
THE CHALLENGE: EXPANDING MINDS
Technology has not yet reached the point where it can guide our children's mental development -- if it ever will, or should. Nor can children, without good models, shape their own brains around the intellectual habits that can make comfortable companions either of machines or of their own minds in a rapidly changing world. Adults in a society have a responsibility to children -- all children -- to impart the habits of mental discipline and the special skills refined through centuries of cultural evolution. It is foolish to send forth unshaped mentalities to grapple with the new without equipping them with what has proven worthwhile in the old.
A prudent society controls its own infatuation with "progress" when planning for its young. Unproven technologies and changing modes of living may offer lively visions, but they can also be detrimental to the development of the young plastic brain. The cerebral cortex is a wondrously well-buffered mechanism that can withstand a good bit of well-intentioned bungling. Yet there is a point at which fundamental neural substrates for reasoning may be jeopardized for children who lack proper physical, intellectual, or emotional nurturance. Childhood -- and the brain -- have their own imperatives. In development, missed opportunities may be difficult to recapture.
The growing brain is vulnerable to societal as well as personal neglect. The immediate effects of ecological folly and misdirected social planning are already swelling the rolls of physically endangered brains. The more subtle legacies of television and adult expediency are being manifested in an erosion of academic and personal development for children from all walks of life. Their needs press heavily on our visions of the future.
While "progress" must be judiciously assessed, new developments are both needed and inevitable. Parents and teachers will need to broaden, perhaps even redefine, traditional parameters of intelligence and learning, not simply because of the changing priorities of future technologies but also because of present realities. This book has depicted a growing crisis in academic learning, created in large part by an alienation of children's worlds -- and the mental habits engendered by them -- from the traditional culture of academia. Young brains have been modeled around skills maladaptive for learning. Merely lamenting this fact, however, does not alter the reality or rebuild the brains. Nor does choking our young with more didacticism make them learn to think.
Closing the gap between wayward synapses and intellectual imperatives will not be easy. It will certainly not be accomplished by low-level objectives, such as memorization of information, that can now be accomplished far more efficiently by even the least intelligent computer. Human brains are not only capable of acquiring knowledge, they also hold the potential for wisdom. But wisdom has its own curriculum: conversation, thought, imagination, empathy, reflection. Youth who lack these "basics," who cannot ponder what they have learned, are poorly equipped to become managers of the human enterprise in any era.
The final lesson of plasticity is that a human brain, given good foundations, can continue to adapt and expand for a lifetime. Its vast synaptic potential at birth can bend itself around what is important of the "old" and still have room for new skills demanded by a new century. A well-nourished mind, well-grounded in the precursors of wisdom as well as of knowledge, will continue to grow, learn, develop -- as long as it responds to the prickling of curiosity. Perhaps this quality, above all, is the one we should strive to preserve in all our children. With it, supported by language, thought, and imagination, minds of the future will shape themselves around new challenges -- whatever they may be. But if we continue to neglect either these foundations or the curiosity that sets them in motion, we will truly all be endangered.