
Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:06 am
by admin
Definition of Infinity Expands for Scientists And Mathematicians
by Sharon Begley
July 29, 2005; Page B1

At the Hotel Infinity, managers never have a problem with overbooking. If you arrive with a reservation and find that the hotel's infinite number of rooms (named 1, 2, 3 and so on, forever) are all occupied, the manager simply moves the guest in Room 1 to Room 2, the guest in Room 2 to Room 3, and on and on until every guest has a room and you get Room 1. In an "infinite set" such as the rooms at the Hotel, whatever you thought was the highest-numbered member of that set isn't.

The next time you're in town, you have an infinite number of friends in tow, and you try the Hotel Infinity again. The manager is happy to accommodate a party of infinity even though his infinite rooms are, again, full. Knowing that your friends have an odd aversion to even numbers, he moves the guest in Room 1 to Room 2, the guest in Room 2 to Room 4, the guest in Room 3 to Room 6, etc. You and your friends get the odd-numbered rooms, of which there are, conveniently, an infinite number.
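
To make the two relocation schemes concrete, here is a minimal sketch in Python; the function names and the handful of rooms printed are illustrative choices, not anything from the column.

```python
# Hilbert's Hotel relocation maps, printed for the first few of the
# infinitely many rooms (numbered 1, 2, 3, ...).

def make_room_for_one(n):
    """Shift map: the guest in Room n moves to Room n + 1, freeing Room 1."""
    return n + 1

def make_room_for_infinitely_many(n):
    """Doubling map: the guest in Room n moves to Room 2n, freeing every odd-numbered room."""
    return 2 * n

for n in range(1, 6):
    print(f"One new arrival:        Room {n} -> Room {make_room_for_one(n)}")
    print(f"Infinitely many guests: Room {n} -> Room {make_room_for_infinitely_many(n)}")
```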

If thinking of infinities makes your head spin, you're in good company. Georg Cantor, the 19th-century mathematician who did more than anyone to explore infinities, suffered a nervous breakdown and repeated bouts of depression. In the 1930s, some fed-up mathematicians even argued that infinities should be banned from mathematics. Today, however, infinities aren't just a central part of mathematics. More surprising, says cosmologist John Barrow of the University of Cambridge, England, in his charming new tome, "The Infinite Book," scientists who study the real world are having to take infinities seriously, too.

Not long ago, if the solution to an equation included an infinity, alarms went off. In particle physics, for instance, "the appearance of an infinite answer was always taken as a warning that you had made a wrong turn," Prof. Barrow says. So physicists performed a sleight-of-hand, subtracting the infinite part of the answer and leaving the finite part. The finite part produced by this "renormalization" was always in "spectacularly good agreement with experiments," he says, but "there was always a deep uneasiness" over erasing infinities so blithely. Might physicists, blinded by their abhorrence of infinities, have been erasing a deep truth of nature?

Suspecting just that, some scientists now see infinities "as an essential part of the physical description of the universe," says Prof. Barrow. For instance, Einstein's equations say the universe began in, and will end with, an infinity of density and temperature, something long regarded as a sign that his theory breaks down at the beginning and end of time. But in a 2004 paper, Prof. Barrow calculated that Einstein's equations allow a point of infinite pressure to arise throughout the expanding universe at some time in the future.

In addition to coming around to the view that infinities might be real, rather than signs of a problem with Einstein's and other theories, some cosmologists suspect that infinities at the beginning and end of time "have quite different structures," Prof. Barrow writes. Just as at the hotel, not all infinities are equal. And that is making the weird math of different-size infinities suddenly relevant in the physical world, too.

To mathematicians, "equal" means you can match the elements in one set to the elements in another, one to one, with nothing left over. For instance, there is an infinite number of integers: 1, 2, 3, 4 . . . . There is also an infinite number of squares: 1, 4, 9, 16 . . . . You can match every integer with a square (1 with 1, 2 with 4, and so on), so the two sets are equal, as long as you never stop matching. But wait: Every square also belongs to the set of integers. That suggests that the set of integers is larger, since it contains all the squares and then some. Surely there are more integers than squares, right?

Actually, no. Before his breakdown, Cantor asserted that if the elements in one infinite set match up one to one with the counting numbers, then those infinities are of equal size. The infinity of squares and the infinity of integers (and the infinity of even numbers) are therefore equal, even though the infinity of integers is denser.
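
Written out, the pairing the column describes is an explicit one-to-one correspondence; a brief sketch in standard notation:

```latex
% Each counting number n is matched with the square n^2, and every square
% is matched with exactly one counting number, so nothing is left over.
f : \{1, 2, 3, \dots\} \to \{1, 4, 9, 16, \dots\}, \qquad f(n) = n^2 .
```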

Decimals, however, are different, mathematicians say. There is an infinite number of them, too, but this infinity is larger than the infinity of integers or squares. Even in the tiny space between zero and 1, there's an infinite number of decimals with no certainty as to what comes next. What comes after .1, for example? Is it .11 or .2?
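
The standard justification for calling this a strictly larger infinity is Cantor's diagonal argument, sketched briefly below; this is textbook material rather than anything spelled out in the column.

```latex
% Suppose every decimal between 0 and 1 could be listed as x_1, x_2, x_3, ...
% Build a new decimal d = 0.d_1 d_2 d_3 ... whose nth digit differs from the
% nth digit of x_n, for example:
d_n =
\begin{cases}
5, & \text{if the $n$th digit of } x_n \neq 5,\\
6, & \text{if the $n$th digit of } x_n = 5.
\end{cases}
% Then d disagrees with every x_n somewhere, so it is missing from the list:
% the decimals cannot be matched one to one with the counting numbers.
```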

Just as mathematicians found a distinction among infinities, so scientists trying to fathom the physical world may need to distinguish among infinities.

In his study of infinities, Prof. Barrow noticed that a universe like ours that seems infinite in size, extending without bound, presents curious ethical dilemmas. An infinite universe must have infinite amounts of good and evil, he writes. Nothing we do, or fail to do, can change that, for adding a bit of good to an infinite amount of good still leaves infinite good, and subtracting a bit of evil from an infinite amount of evil still leaves infinite evil. "What is the status of good and evil," he wonders, "when all possible outcomes actually arise somewhere" ... or sometime? Small wonder infinity drove Cantor mad.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:07 am
by admin
Early Cancer Detection Doesn't Always Give Patient an Advantage
by Sharon Begley
August 26, 2005; Page B1

(See Corrections & Amplifications item below.)

When Richard Bloch, co-founder with his brother of H&R Block, died of heart failure in 2004 at age 78, he was a medical success story. In 1978, he was diagnosed with terminal lung cancer. A decade later, he had colon cancer. He beat both, and went on to found a cancer hotline, a survivors group and other services. He was counted as someone whose cancer was detected early enough to save his life.

Should he have been?

Nothing has greater intuitive appeal than the claim that cancer screening leads to early detection, which leads to longer survival. Whether it is the PSA test for prostate cancer, mammograms, endoscopy for colon cancer or -- in the wake of Peter Jennings' untimely death -- X-ray screening for lung cancer, intuition screams that the earlier a cancer is caught, the better the odds that you'll be alive in five years. Like Mr. Bloch.

Cancer researchers are now augmenting that intuition with data, and the result isn't pretty. The impact of cancer screening "on reducing cancer mortality," says Elaine Jaffe of the National Cancer Institute, "still isn't proven for a number of cancers."

How can that be?

Part of the answer is that many tumors are so slow to progress -- indolent, scientists call them -- that they'll hang out in an organ for decades with no ill effects. "Early on, the idea of an indolent tumor was just a theoretical construct," says Barnett Kramer of the NCI. But indolence was found to characterize many neuroblastomas (a cancer of the nervous system), "and then it was found that tumors in the prostate, lung and now breast can also be indolent."

That doesn't mean cancer screening is useless. Without question, some of the tumors it finds would, if left untreated, have killed patients before their time, and some of the improvement in survival rates after breast cancer likely reflects earlier detection. But you can be misled into attributing the decades of life you enjoy after "beating" cancer to early detection and treatment rather than to the properties of the tumor itself.

Left to its own devices, the tumor might well have left you alone until you died of something else entirely. "Overdiagnosis of cancer as a result of screening is the rule rather than the exception," says Dr. Kramer.

This overdiagnosis isn't the false positives that tests such as mammograms can spit out. In that case, what is detected might look like cancer, but on further examination is not. False positives cause great anxiety and cost, as patients undergo more tests. But diagnosing an indolent cancer is arguably worse, as patients undergo treatments that often have debilitating, even dangerous, side effects.

Overdiagnosis has another effect: on perceptions of progress in the war on cancer. More-sensitive screening means tumors are detected at ever-earlier stages. Let's say that, as a result of such a screening, a patient begins treatment on Aug. 26, 2005. She does well, and celebrates her five-year survival on Aug. 26, 2010.

If she succumbs to a recurrence or a spread of her initial cancer in, say, 2015, she still counts as a five-year survivor. But if she had a slow-growing cancer she might have made it to 2015 anyway, without early diagnosis and treatment. She is scored as a victory for cancer warriors, but in fact they didn't buy her a single extra day of life. All she got was more years knowing she had a dreaded disease.
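
A tiny numerical sketch of that lead-time effect, using the column's dates plus a hypothetical later, symptom-driven diagnosis date for comparison:

```python
from datetime import date

death = date(2015, 8, 26)                   # same outcome either way (illustrative)
diagnosed_by_screening = date(2005, 8, 26)  # tumor found early by a sensitive test
diagnosed_by_symptoms = date(2012, 8, 26)   # hypothetical diagnosis without screening

years_screened = (death - diagnosed_by_screening).days / 365.25
years_unscreened = (death - diagnosed_by_symptoms).days / 365.25

print(f"Survival from diagnosis, screened:   {years_screened:.1f} years -- a 'five-year survivor'")
print(f"Survival from diagnosis, unscreened: {years_unscreened:.1f} years")
print("The date of death is identical; only the point where the survival clock starts has moved.")
```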

"The improvement in long-term mortality may be due to the higher proportion of small or slow-growing tumors being detected, which means you start counting earlier," says Dr. Jaffe. That's why longer survival, measured from the time of diagnosis, is a misleading measure of progress against cancer, and no substitute for reductions in mortality.

The more scientists study cancers, the more indolent ones they discover. Researchers in Japan, for instance, find that CT scans detect almost as many lung lesions in nonsmokers as in smokers. But since nonsmokers have a mortality rate from lung cancer less than 10% that of smokers, the vast majority of what CT scans picked up would never have progressed to anything life-threatening. And a Mayo Clinic study found that although X-rays detect lung cancers at earlier stages, and lead to more five-year survivors, early detection does not lower death rates.

For colon cancer, the fecal occult blood test "does decrease your risk of dying of this cancer," says Dr. Kramer. "But for colonoscopy and sigmoidoscopy, which appeal to our intuition [about early detection], the evidence is not great." They pick up polyps earlier, but not all polyps become cancers, "and we don't know what proportion would lead to death."

The Pap test for cervical cancer has saved lives, but many of the abnormal cells it finds wouldn't go on to become cancer. Most women with low-grade or even high-grade lesions would have been fine anyway. Similarly, the PSA test for prostate cancer picks up tumors that are biologically nonaggressive.

The discovery that many tumors are innocuous casts doubt on the value of new screening tests. "You may fool yourself into thinking a test is twice as sensitive," says Dr. Kramer, "but the only extra cancers it picks up are those that wouldn't have harmed the patient."

Corrections & Amplifications:

Barnett Kramer is associate director for Disease Prevention at the National Institutes of Health. This article incorrectly said Dr. Kramer is with the NIH's National Cancer Institute, which was his previous employer.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:11 am
by admin
Evolutionary Psych May Not Help Explain Our Behavior After All
by Sharon Begley
April 29, 2005; Page B1

Like almost everyone else, David J. Buller says he was "completely captivated" by evolutionary psychology, and no wonder. This field claims to explain human behaviors that seem so widespread we must be wired for them: women preferring high-status men, and men falling for nubile babes; stepfathers abusing stepchildren. Even the more troubling claims, such as one saying rape gave our male ancestors a reproductive edge, have caught on, as laypeople and scientists alike say, yeah, that makes sense. In a nutshell, evo psych argues that Pleistocene humans who engaged in certain behaviors left more descendants than did contemporaries who did not engage in those behaviors. As a result, we, their descendants, are wired for the behaviors.

But as Prof. Buller, a professor of philosophy at Northern Illinois University, dug deeper, he concluded that the claims of evo psych are "wrong in almost every detail" because the data underlying them are deeply flawed. His book "Adapting Minds," from MIT Press, is the most persuasive critique of evo psych I have encountered.

Take the stepfather claim. The evolutionary reasoning is this: A Stone Age man who focused his care and support on his biological children, rather than kids his mate had from an earlier liaison, would do better by evolution's scorecard (how many descendants he left) than a man who cared for his stepchildren. By this reasoning, a stepfather should be far more likely to abuse his stepchildren. One textbook asserts that kids living with a parent and a stepparent are some 40 times as likely to be abused as those living with biological parents.

But that's not what the data say, Prof. Buller finds. First, reports that a child living in a family with a stepfather was abused rarely say who the abuser was. Some children are abused by their biological mother, so blaming all stepchild abuse on the stepfather distorts reality. Also, a child's bruises or broken bones are more likely to be called abuse when a stepfather is in the home, and more likely to be called accidental when a biological father is, so data showing a higher incidence of abuse in homes with a stepfather are again biased. "There is no substantial difference between the rates of severe violence committed by genetic parents and by stepparents," Prof. Buller concludes.

On a lighter note, evolutionary psychology claims that men prefer fertile, nubile young women because men wired for this preference came out ahead in the contest for survival of the fittest. The key study here asked 10,047 people in 33 countries what age mate they would prefer. The men's answer: a 25-year-old.

But the men were, on average, in their late 20s. One of the most robust findings about human behavior is that people prefer a mate who matches them in education, class and religious background, ethnicity -- and age. The rule that "likes attract" is enough to explain why young men prefer young women. Besides, if you scrutinize the data, you find that 50-ish men prefer 40-something women, not 25-year-olds, undermining a core claim of evo psych.

The argument that Stone Age women preferred good providers, and that today's women are therefore wired to see a big bankroll as the ultimate aphrodisiac, is also shaky. Among some hunter-gatherers today, young mothers receive more food from their mothers than from their husbands. That makes even the theoretical basis for the claim -- that women who sought good providers had an evolutionary edge -- problematic.

The empirical basis is no better. On average, 25-year-old women say they prefer 28-year-old men, even though 50-year-old men have much more of the high status and resources that evo psych says they are wired to lust after. Again, likes attract more than "good providers" do.

In defense of the "good provider" theory, evolutionary psychologists cite studies of female college students asked to choose their ideal mate. Shown photos of young men -- one in the uniform of a fast-food worker, one looking like a middle manager, the third like a CEO -- they indeed choose one of the latter two. But just as people prefer to marry someone near them in age, they prefer to marry someone like them socioeconomically. The fact that female college students, usually middle- or upper-class, prefer medium- or high-status men could simply reflect their preference for a man who looks as though he comes from the same socioeconomic background, Prof. Buller points out. Also, earning capacity is a sign of other traits, such as education level and socioeconomic background. So although it seems that the women are being asked how important their mate's income is, they are likely using income as a sign of the other things they care about.

Evolutionary psychology has a more fundamental problem than the shakiness of its data and the fact that the data can be interpreted in more than one way. Why, if child abuse by stepfathers is such a great evolutionary strategy, do many more stepdads love and care for their stepchildren than abuse them? And why, if rape is "such an advantageous reproductive strategy, [is it that] there are so many more men who do not rape than who do," asks primatologist Frans de Waal of Emory University, Atlanta.

After "Adapting Minds," it is impossible to ever again think that human behavior is the Stone Age artifact that evolutionary psychology claims.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:11 am
by admin
Fluoridation, Cancer: Did Researchers Ask The Right Questions?
by Sharon Begley
July 22, 2005; Page B1

When health officials decided to add fluoride to the water supply of Grand Rapids, Mich., in 1945, they plunged ahead despite the lack of a rigorous, large-scale study of the risks and benefits. And for most of the next 60 years, fluoridation research has gone pretty much like that. It has not been science's finest hour.

Questions about fluoridation have returned with renewed vigor because of allegations of scientific misconduct against a prominent researcher at the Harvard School of Dental Medicine. The Environmental Working Group, an advocacy organization in Washington, charged last month that Chester Douglass misrepresented an unpublished study about bone cancer and fluoridated tap water. In written testimony to the National Research Council last year, Dr. Douglass said he had found no evidence that fluoridation increased risk of osteosarcoma, a rare bone cancer. But a 2001 study he cited, and oversaw, found that boys who drink fluoridated water have a greater risk of developing the disease. (Dr. Douglass did not respond to requests for comment.)

More interesting than what Dr. Douglass said or didn't say, however, is the study he swept under the rug. It was conducted by one of his doctoral students, Elise Bassin. She started with the same raw data as her mentor -- 139 people with osteosarcoma and 280 healthy "controls" -- but saw a way to improve on it. Since most of the 400 people diagnosed in the U.S. each year with osteosarcoma are kids, and since any ill effect of fluoride would likely come when bones are growing most quickly, she focused on the 91 patients who were under 20.

Her result: Among boys drinking water with 30% to 99% of the fluoride levels recommended by the U.S. Centers for Disease Control and Prevention, the risk of osteosarcoma was estimated to be five times as great as among boys drinking nonfluoridated water. At 100% or more, the risk was an estimated seven times as high. The association was greatest for boys six to eight.

To be sure, one study proves nothing. Moreover, Dr. Bassin hasn't published her core findings (though in 2004 she and colleagues published a description of their methodologies). As Boston University epidemiologist Richard Clapp says, "Peer review picks up things that even doctoral students at Harvard might miss."

So I asked scientists to read the study. BU's Kenneth Rothman, founding editor of the journal Epidemiology, called it "of publishable quality." Zeroing in on young patients, he said, was good science: "If there were an adverse effect of fluoride, it's possible an effect of early exposure would be manifest in the first 20 years of life -- but not after." Looking at all ages, in other words, could conceal any link between fluoridation and cancer.

Besides focusing on kids, Dr. Bassin and her colleagues found out where each cancer patient ever lived, and what kind of water they drank when. Other studies have just noted what water a patient was drinking at the time of diagnosis. The problem with that is, you risk classifying someone as drinking nonfluoridated water who in fact drank fluoridated water when it mattered -- in childhood. The result is that the osteosarcoma rates of people drinking fluoridated water might look no different from those of people drinking nonfluoridated. "She did great shoe-leather epidemiology," says William Maas, head of oral health at the CDC and a supporter of fluoridation.

Previous studies have been contradictory. A 1991 animal study by the National Toxicology Program concluded that fluoride might raise the risk of osteosarcoma, but only in male rats, not female. Also in 1991, a scientist at the National Cancer Institute found an "unexplained increase" in osteosarcoma in men under 20 in fluoridated communities. Most human studies, though, provide "no credible evidence for an association between fluoride in drinking water and the risk of cancer," said a 1993 NRC report.

But when you look carefully at the negative studies, you have to wonder. Some investigated a link to all cancers; because osteosarcoma is rare, an increase would be unlikely to show up in that vast sea. Other studies were tiny, or included adults as old as 84, which would wash out effects that target kids. Most categorized osteosarcoma patients as drinking fluoridated or nonfluoridated water based on where they lived at diagnosis, not as kids. Concerned about such lapses, the NRC report called the studies "of limited sensitivity."

Even if fluoridation causes just a few hundred cases of osteosarcoma every year, does the public health benefit justify that risk? "When we started fluoridating water, we thought to get the benefits it would have to get incorporated into the enamel before the tooth erupted," which happens only if you swallow it, says the CDC's Dr. Maas. But that turns out not to be so. Topical fluoride, as in gels and toothpaste, works at least as well.

Most proponents now say fluoridation cuts the rate of tooth decay 18% to 25%. How much is that? Less than one tooth surface. "The absolute impact of 18% or even 25% is low," says Steven Levy of the University of Iowa, who supports fluoridation.

The next authoritative report on fluoridation will be the NRC's. One scientist close to the committee thinks it may be released this fall, months later than expected. "We thought this was going to be routine," he says. "It wasn't." With fluoridation, it seldom is.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:12 am
by admin
'Gene Pill' Offers Alternative to Shots
by Sharon Begley
Staff Reporter of THE WALL STREET JOURNAL
June 21, 2005; Page D8

For people who have to inject themselves regularly with insulin to treat diabetes, erythropoietin to treat anemia or other protein drugs for various diseases, there may be hope for an end one day to being a human pin cushion.

University scientists and a biotechnology company are developing an alternative to injecting the drugs. Inspired by gene therapy, it is called a "gene pill" and contains the gene for a disease-treating protein rather than the protein itself.

Many helpful drugs are actually proteins. But proteins make poor pills, because they are broken down in the gut or poorly absorbed, with the result that they don't deliver the intended benefit. The only choice is to inject them.

However, the body itself makes proteins all the time in cells, following the instructions of genes in the cells. Now researchers are working on delivering genes for medicinal proteins to the body through a pill. A study in lab animals showed that, not only do the genes survive their digestive trip intact, they also get incorporated into cells of the gut -- which then produce the helpful proteins for the body to use.

Although the research is preliminary, with studies in humans still on the drawing boards, outside experts agree it shows promise. The gene pill "could provide an effective alternative method for delivering protein drugs currently administered only through injection," said David Klonoff, clinical professor at the University of California, San Francisco, and editor in chief of the journal Diabetes Technology & Therapeutics, which published the study in its June issue.

Today's protein drugs, such as growth hormone to treat dwarfism and blood factors to treat hemophilia, have several drawbacks. Patients often skip doses because the drugs have to be injected, rather than swallowed. Also, these proteins are either extracted from human cadavers or animal tissue, which is slow and inefficient, or -- more commonly -- produced through recombinant DNA, which is expensive. Moreover, injectable drugs are difficult and expensive to store, limiting their use in developing countries.

The gene pill is designed to avoid these problems. The cells lining the intestine are the only ones that take up the DNA for the therapeutic protein, which the cells release into the bloodstream.

The gene itself stays out of the bloodstream, with the result that it can't reach tissues where it might pose a risk. In some trials of traditional gene therapy, in which a virus ferries a therapeutic gene into a patient's cells, the virus has caused dangerous inflammation or disrupted cancer-suppressing genes, causing two deaths and leading the Food and Drug Administration to suspend some gene-therapy trials in the U.S.

Because cells of the intestine are sloughed off, excreted and replaced every few days, there is little danger that the inserted gene will go astray or deliver too high a dose of the therapeutic protein, says Stephen Rothman, professor emeritus at UCSF and a developer of the gene pill. The pill would be taken every two days or so.

In 1997, Dr. Rothman and three UC colleagues founded Genteric Inc., of Alameda, Calif., which is developing the gene pill commercially. UCSF holds four patents on the gene pill, for which it has granted an exclusive license to Genteric. Dr. Rothman has a financial stake in the closely held company.

In the new study, he and his colleagues gave lab rats and mice several different genes, through a tube. They found that the intestine cells do take up the gene and make the protein, and that they secrete the protein into the blood. When they used the gene for insulin, they showed that the insulin not only gets into the blood but also produces a therapeutic response, in this case lowering levels of blood sugar in rats with diabetes.

"Our approach seeks to avoid many of the problems with current approaches to gene therapy," Dr. Rothman says. In current approaches, once the gene is given to a patient it can't be undone even if it causes harm, as in the patients who developed cancer. In contrast, the effects of the gene pill last only a day or two, until the patient takes another pill.

Experts in gene therapy say they welcome variations on the standard approach. "For relatively small molecules like insulin, this should perhaps work," says Katherine High, a gene-therapy pioneer and professor of pediatrics at the Children's Hospital of Philadelphia.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:12 am
by admin
Grandma's behavior while pregnant impacts lineage
by Sharon Begley
Friday, May 13, 2005

Although life offers no guarantees, parents-to-be can increase their chances of having a healthy baby by, among other things, undergoing prenatal testing and making sure mom has a healthy pregnancy.

But almost 2,500 years after Euripides noticed that "the gods visit the sins of the fathers upon the children," scientists are discovering that nature can be even crueler than the ancient Greek imagined: It can visit the sins of the grandparents on the children.

Such "transgenerational" effects are the latest focus of a growing field called fetal programming, or the fetal origins of adult diseases. It examines how conditions in the womb shape physiology in a way that makes people more vulnerable decades later to cardiovascular disease, diabetes, immune problems and other illnesses usually blamed on genetics or lifestyle, not on what arrived via the placenta. If a fetus is poorly nourished, for instance, it can develop a "thrifty phenotype" that makes it really good at getting the most out of every meal. After birth, that lets it thrive if food is scarce, but it's a recipe for Type 2 diabetes in a world of doughnuts and fries. Poor fetal nutrition can lead to hypertension, too: If it causes the fetus to produce too few kidney cells, the adult that the fetus will become won't be able to regulate blood pressure well.

Now, in a finding that seems to put our fate even further outside our control, researchers are seeing generation-skipping effects.

Last month, scientists reported that a child whose grandmother smoked while pregnant with the child's mother may have twice the risk of developing asthma as a child whose grandma didn't flood her fetus with carcinogens. Remarkably, the risk from grandma's smoking was as great as or greater than from mom's. Kids whose mothers smoked while pregnant were 1.5 times as likely to develop childhood asthma as children of nonsmoking moms. Kids whose grandmothers smoked while pregnant with mom were 2.1 times as likely to develop asthma, scientists reported in the journal Chest.

The harmful effects of tobacco, it seems, can reach down two generations even when the intervening generation -- mom -- has no reason to suspect her child may be at risk.

"Even if the mother didn't smoke, there was an effect on the grandchild," says Frank Gilliland of the University of Southern California, Los Angeles, who led the study of 908 children. "If smoking has this transgenerational effect, it's a lot worse than we realized."

What causes the grandma effect? One suspect is DNA in the fetus's eggs (all the eggs a girl will ever have are made before birth). Chemicals in smoke might change the on-off pattern of genes in eggs, including genes of the immune system, affecting children who develop from those eggs. Men whose mothers smoked don't seem to pass on such abnormalities, probably because sperm are made after birth.

Animal data hint at other grandma effects. Last week, scientists reported the first discovery that obesity and insulin resistance, as in Type 2 diabetes, can be visited on the grandkids of female rats that ate a protein-poor diet during pregnancy, lactation or both. Again, this occurred even when those rats' offspring, the mothers of the affected grandkids, were healthy, Elena Zambrano of the Institute of Medical Sciences and Nutrition, Mexico City, and colleagues report in the Journal of Physiology.

The findings, says Peter Nathanielsz of the University of Texas Health Sciences Center, San Antonio, "stretch the unwanted consequences of poor nutrition across generations."

In people, the type of "nutritional insult" to the fetus doesn't seem to matter. Too few calories, too little protein, too few other nutrients can all lead to diabetes, hypertension and other ills decades later. "That suggests that what links diet to adult diseases is something quite fundamental," says Simon Langley-Evans of the University of Nottingham, England. The key suspects: changes in DNA activity in the fetus or in the balance of hormones reaching it via the placenta.

Alarmingly, the list of what can be passed along to the next generation is growing. If you are undernourished as a first-trimester fetus, you won't pad your hips and thighs with enough fat tissue. If, as a child or adult, you take in more calories than you expend, the extras get stored in and around abdominal organs rather than on the thighs and hips, says Aryeh Stein of Emory University, Atlanta. One result is a body shaped like an apple (which brings a higher risk of heart disease). Another is a higher risk of gestational diabetes, in which blood glucose levels rise during pregnancy and too much glucose reaches the fetus. Babies born to moms with gestational diabetes have a higher risk of Type 2 diabetes.

When undernourished fetuses grow into adolescents, they don't respond as well to vaccines as babies who had a healthy gestation, scientists led by Thomas McCune of Northwestern University, Evanston, Ill., find. One reason may be that the third trimester is a critical time for development of the thymus, which produces the immune system's T cells. When immune-compromised girls become pregnant, they have less chance of having a healthy pregnancy and a healthy baby. Score another for the grandma effect.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:13 am
by admin
How Brief Drop in Cars Can Trigger Tie-Ups, And Other Traffic Tales
by Sharon Begley
July 1, 2005; Page B1

If you plan to hit the roads like the zillions of other drivers this holiday weekend, Avi Polus has a word of advice: patience.

A transportation engineer at Technion-Israel Institute of Technology in Haifa, Prof. Polus is concerned not with drivers' collective blood pressure but with traffic flow. Like the growing number of other engineers and physicists who are hubcap-deep in the science of traffic, he is determined to explain infuriating mysteries such as phantom traffic jams (There's no bottleneck or accident at the front of this jam, so why weren't we moving?) and why a brief drop in volume can, paradoxically, trigger a long-lasting traffic jam.

Impatience on two-lane roads actually improves traffic flow, as antsy drivers pass slowpokes rather than letting a convoy form. On highways, however, "passing, aggressive behavior and lane changing is greatly detrimental to the flow," says Prof. Polus.

The reason is that chronic lane changing simulates the "weaving section" of a highway. If an off-ramp lies just beyond an on-ramp, entering drivers merge left (assuming ramps are on the right) and exiting drivers merge right, causing traffic to crisscross like mobile braids. When, in heavy traffic, many drivers change lanes again and again, trying to find the one that is moving faster, the same weaving effect kicks in, reducing the capacity of that section of road.

"Weaving is the worst condition for traffic flow," says Prof. Polus. Because drivers in heavy traffic brake when a car pulls into their lane, and because it takes time to get back up to speed, there are larger and constantly-changing gaps between vehicles. That invites yet more cars to change lanes, propagating a wave of stop-and-go traffic that cuts the number of cars in a stretch of road by about 10%, calculates Prof. Polus, who will present his work at the 16th International Symposium on Transportation and Traffic Theory at the University of Maryland this month. That may not sound so dire, but in rush hour the result is a five-mile backup, his calculations show. In congestion, be content with the lane you're in.

More and more scientists are modeling traffic with equations from the branch of math called nonlinear dynamics, which describes systems that suddenly jump from one state to another. Like water that suddenly freezes, flowing traffic can spontaneously seize up, beginning at a single point of crystallization (the idiots who braked to rubberneck) and causing a wave of high density to spread backward.

Lane closures, on-ramps, uphill grades, chronic lane changing and other "inhomogeneities" in traffic flow can all trigger a density wave, Martin Treiber of Dresden University of Technology has shown in mesmerizing simulations (www.traffic-simulation.de/). One result can be "phantom" jams, which occur so far upstream of the bottleneck that the congestion there has long cleared by the time drivers at the back of the pack reach it. As a result, they never see the snafu that flipped smooth flow into a stop-and-go mess. By one estimate, three-quarters of traffic jams are phantoms.

Carlos Daganzo of the University of California, Berkeley, was puzzled by what highway sensors showed: When congested traffic forms upstream of a bottleneck, the rate at which cars at the front leave the congested area decreases. "It's as if, when a line forms at the popcorn stand, the server slows down, so people leave with their popcorn at a slower rate just because there are more people waiting," he says.

Yet the counterintuitive effect is seen time and again, and in a recent study he and colleagues figured out why. The congestion causes cars to jockey across lanes, ever on the lookout for the faster one. Lane changing increases the gaps between cars, as drivers slow down when someone barges in front of them. Bigger gaps mean fewer cars per second leaving the front of the jam.

If that seems counterintuitive, consider that briefly reducing volume can trigger a stop-and-go wave. Within the region with suddenly fewer cars, perhaps because a long funeral cortege just exited, the emptier road entices drivers to speed up ("Open road -- yes!"). But sooner or later, Prof. Treiber notes, these drivers catch up to a denser, slower-moving region. The ensuing braking can trigger the dreaded density wave.

Most jams occur way before a road reaches its capacity, and the culprits are all around you. Even in heavy but moving traffic, inhomogeneities would have much less effect if drivers had faster reaction times. When merging traffic causes the driver in front of you to brake, you do so as well, unless you enjoy fender benders. But because braking takes time, the gap between you and the car ahead shrinks, explains Prof. Treiber. You slow even further until the gap reaches a size you are comfortable with. Result: You are now traveling even more slowly than the car whose braking triggered the stop-and-go wave in the first place. The car behind you does the same, and the effect propagates backward, often for miles.
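
As a rough illustration of how one driver's braking ripples backward, here is a minimal car-following sketch loosely in the spirit of the Intelligent Driver Model behind Prof. Treiber's online simulations; every parameter value and the braking scenario are illustrative assumptions of mine, not figures from his work.

```python
import math

# Illustrative Intelligent-Driver-Model-style parameters (assumed, not Prof. Treiber's).
V0, T, A, B, S0 = 30.0, 1.5, 1.0, 2.0, 2.0  # desired speed (m/s), time headway (s),
                                            # max accel, comfortable decel (m/s^2), min gap (m)

def follower_accel(v, v_lead, gap):
    """Acceleration chosen by a driver reacting to the car ahead."""
    s_star = S0 + v * T + v * (v - v_lead) / (2 * math.sqrt(A * B))
    return A * (1 - (v / V0) ** 4 - (s_star / max(gap, 0.1)) ** 2)

N, dt = 20, 0.25                   # 20 cars, quarter-second time steps
x = [-36.0 * i for i in range(N)]  # car 0 leads; the rest follow 36 m apart
v = [20.0] * N                     # everyone starts at 20 m/s (about 45 mph)

for step in range(1200):           # 300 seconds of driving
    t = step * dt
    acc = [A * (1 - (v[0] / V0) ** 4)]  # the lead car drives freely...
    if 20 <= t < 25:
        acc[0] = -3.0                   # ...except for a five-second braking episode
    for i in range(1, N):
        acc.append(follower_accel(v[i], v[i - 1], x[i - 1] - x[i]))
    for i in range(N):
        v[i] = max(0.0, v[i] + acc[i] * dt)
        x[i] += v[i] * dt
    if step % 160 == 0:
        print(f"t={t:5.0f}s  speeds of cars 0/5/10/15: "
              f"{v[0]:5.1f} {v[5]:5.1f} {v[10]:5.1f} {v[15]:5.1f} m/s")
```

In the printout, the dip in speed should show up at cars farther back in the platoon well after the lead car has already returned to cruising speed, which is the backward-propagating wave described above.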

You can lessen this effect, however. Prof. Treiber suggests looking a few cars ahead so you know when and how much to brake. "If you brake just in time, you can usually safely brake less," he says, "which improves the flow." Consider it a good deed.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:13 am
by admin
Hurricane Forecasters Try Model That Focuses On Chances of Landfall
by Sharon Begley
June 10, 2005; Page B1

Forecasting the future is tough enough. But when it comes to hurricanes, "predicting" the past is no cakewalk either.

To predict the coming hurricane season, scientists look at climate factors in late summer that are linked to hurricane activity. Then they see how well they can predict those factors -- ocean temperatures and currents, El Niño conditions, wind patterns -- and thus the number and intensity of coming storms. Next, they test this model, plugging in the numbers from a particular year in the past and seeing if the model correctly "predicted" that year's hurricanes. If not, they fine-tune equations, adjust the weight they give each factor ... and order in crystal balls and chicken entrails.
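
As a sketch of what "predicting the past" looks like in practice, here is a toy leave-one-out hindcast in Python; the predictor values and hurricane counts are invented for illustration, and real seasonal models are far more elaborate.

```python
import numpy as np

# Toy hindcast: for each past year, fit a linear model on all the other years,
# then "predict" the held-out year's hurricane count from its climate factors.
# Every number below is made up purely for illustration.
years = np.arange(1995, 2005)
factors = np.array([            # e.g. a sea-temperature anomaly and a wind index
    [0.2, -0.1], [0.5, -0.3], [0.1, 0.2], [0.4, -0.2], [0.6, -0.4],
    [0.3, 0.0], [0.2, 0.1], [0.5, -0.1], [0.7, -0.5], [0.6, -0.3],
])
hurricanes = np.array([11, 9, 3, 10, 12, 8, 9, 12, 7, 9])

X = np.column_stack([np.ones(len(years)), factors])   # intercept plus the two factors

for i, year in enumerate(years):
    keep = np.arange(len(years)) != i
    coeffs, *_ = np.linalg.lstsq(X[keep], hurricanes[keep], rcond=None)
    retrodiction = X[i] @ coeffs
    print(f"{year}: retrodicted {retrodiction:4.1f} hurricanes, observed {hurricanes[i]}")
```

Tuning then amounts to changing which factors enter the model, and how heavily each is weighted, until the retrodictions track the observed seasons.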

I exaggerate only slightly. But seasonal hurricane forecasts clearly need help. In May 2004, the National Oceanic and Atmospheric Administration, home of the nation's meteorologists, forecast a 50% chance of a higher-than-normal Atlantic hurricane season, with two to four major hurricanes. That August, NOAA revised the odds -- down: It pegged the chance of an unusually intense season at only 45%. The respected team at Colorado State University, Fort Collins, also lowered its forecast in August, just days before Charley formed.

But as NOAA wrote in a postmortem after the rampages of Charley, Frances, Ivan and Jeanne caused a record $22 billion in insured losses and killed at least 3,100 people, 2004 "had well-above-normal activity," with six major hurricanes.

With the 2005 hurricane season under way as of June 1, the unforeseen (by many) devastation of 2004 has led critics of the traditional methodology to argue that it is time to throw out the standard crystal ball, which relies heavily on the sea surface temperatures from which storms draw their fury. They are also calling for forecasters to focus not just on how many storms will form but on how many will make landfall. That's one of the toughest parts of forecasts, but it is also where scientists are making surprising progress. In a promising new model, the number of hurricanes making landfall in the U.S. depends on conditions you'd never suspect.

When atmospheric scientists Mark Saunders and Adam Lea, of the Tropical Storm Risk unit at University College London, scrutinized 54 years of data and looked for correlations between wind patterns and the hurricanes reaching U.S. shores, one set of measurements stood out: wind patterns 2,000 to 22,000 feet up, over six regions of North America and the eastern tropical Pacific and North Atlantic oceans during July. How the strength and direction of these winds deviate from the norm, says Prof. Saunders, "is strongly linked to upcoming hurricane activity."

The reason is that wind patterns either favor or block hurricanes from making landfall. For instance, when the usual high-pressure area around Bermuda is shifted north and is stronger than usual in July, it tends to stay that way. "Once these wind patterns are set up in July, they persist through October," says Prof. Saunders.

The Bermuda high is a crucial factor in determining if a hurricane will make landfall, agrees Steve Smith, an atmospheric physicist at Carvill America, a reinsurance intermediary in Chicago. "From 2000 to 2003, the Bermuda high was closer to Europe and steered hurricanes away from the U.S. coast. But last year it was more westerly," he says. Parked off the U.S. coast, it generated winds that blew hurricanes onto land.

Oddly, winds over the Rocky Mountains have been even better hurricane harbingers. Strong southerly winds over the Rockies in July set up a low-pressure zone over the western Gulf of Mexico. That produces steering winds that push hurricanes toward the Gulf Coast and Florida. Winds over the tropical east Pacific strengthen the low pressure over the Gulf, setting up a wind pattern that arcs north to drive storms onto land.

Measured by its ability to retrodict past hurricane seasons from the wind anomalies in July of those years, Tropical Storm Risk is twice as precise as the sea-temperature method. Its August 2004 forecast said the chance of an unusually intense hurricane season was 86%, compared with NOAA's 45%.

Its 2005 forecast, issued this week, says there is an 86% chance that landfalling hurricanes will put 2005 in the top one-third historically, with two to five intense hurricanes. The Colorado team agreed, upping an earlier forecast to eight hurricanes, half of which would be whoppers with sustained winds above 110 m.p.h.

Get used to it. Surface temperatures in the Atlantic have been elevated since 1995, relative to an historical average that goes back 150 years, notes NOAA's Stanley Goldenberg. From 1995 to 2000, the number of hurricanes almost doubled from the historical norm. Elevated sea temperatures might be part of a normal, 50-year cycle, "but you have to wonder if it is also linked to global warming," says Prof. Saunders.

Either way, "2004 was not unprecedented," says Dr. Smith. "Simple statistics say the return period for storm losses like those of 2004 is 50 to 70 years. But there is reason to believe it might be shorter."

And reason, too, to believe the season will be earlier. As of yesterday, this year's first named storm, Arlene, was swirling through the Caribbean, almost two months ahead of 2004's first.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:14 am
by admin
Imprinted Genes Offer Key to Some Diseases -- And to Possible Cures
by Sharon Begley
June 24, 2005; Page B1

According to the old joke, the homely but brilliant male scientist married the gorgeous but dim model figuring their children would have her looks and his brains. He was crushed when they had her brains and his looks.

The scientist was clearly not among those studying a booming new area of genetics. If he had been, he would have known that whether a child's traits are shaped by mom's genes or dad's genes isn't a simple matter of recessiveness or dominance, let alone of pure luck, as the textbook wisdom says. Instead, some genes come with molecular tags saying (in biochemical-ese), "I come from mom; ignore me," or "You got me from dad; pretend I'm not here."

Such genes are called imprinted. Unlike recessive or dominant genes (such as for black or blond hair), which are composed of different molecules, these genes are identical except for the silencer tag sitting atop them.

The result is that if the active gene is defective, there is no working backup; a healthy but silenced gene from the other parent can't step into the breach. In the joke, mom's beauty genes and dad's brainy genes were silenced, leaving mom's dimwitted genes and dad's homely ones to call the shots.

No one has reliably identified genes for beauty or for brains, let alone figured out whether mom's or dad's count (or whether this explains male-pattern baldness). But real imprinted genes are hitting the big time. Imprinting may be one reason people seem to inherit conditions such as autism, diabetes, Alzheimer's disease, male sexual orientation, obesity and schizophrenia from only one side of the family. At least one biotechnology company is planning to scan the entire human genome for imprinted genes (detectable with a biochemical test), hoping to use the data to diagnose incipient cancers.

Almost all imprinting happens automatically, long before birth, but in some cases it can result from outside interference. Toxic chemicals, for instance, may eliminate the silencer tag, causing potentially harmful effects that can be transmitted to future generations. (Two points to readers who say, "Lamarck lives!")

The number of human genes where the parent-of-origin matters keeps rising. According to a new computer algorithm, about 600 mouse genes are likely to be imprinted, scientists at Duke University report in Genome Research. If that 2.5% rate holds for humans -- and virtually every mouse gene has a human counterpart -- then we have hundreds of imprinted genes, too.
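
The extrapolation is simple arithmetic; assuming the commonly cited figure of roughly 20,000 to 25,000 human genes (my assumption, not a number given in the column), 2.5% works out to several hundred:

```latex
0.025 \times 20{,}000 \approx 500
\qquad\text{to}\qquad
0.025 \times 25{,}000 \approx 625 \ \text{imprinted human genes.}
```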

Among the genes where the parent of origin matters are three on chromosome 10. Only the copies from mom, studies suggest, are turned on. One, expressed in the brain, is linked to late-onset Alzheimer's disease. Another is linked to male sexual orientation, and a third to obesity. With dad's contribution silenced, if there is anything unusual in the copy from mom, that will determine the child's trait. "For Alzheimer's, if the mutation is in dad's gene you'll never see an effect, but if it's in mom's you're at risk for the disease," says Duke's Randy Jirtle.

A gene on chromosome 9, linked to autism, seems to count only if it came from dad. One on chromosome 2 and one on 22 are associated with schizophrenia; only the copies from dad count. Having a family tree mostly free of these diseases is therefore no assurance of good health. If the disease runs on dad's side, his gene may be defective, and that is the one that matters.

As they discover more imprinted genes, scientists are seeing that the silencing tag can be knocked off, with dire consequences. An animal study published this month suggests how. When fetal rats were exposed to two toxic chemicals -- a fungicide called vinclozolin commonly used in vineyards and a pesticide called methoxychlor -- they grew up to have slower- and fewer-than-normal sperm, Michael Skinner of Washington State University and colleagues report in the journal Science. The abnormalities were inherited by the rats' sons, grandsons and great-grandsons.

"That environmental toxins can induce a transgenerational genetic change is a phenomenon we never knew existed," Prof. Skinner says. How does it occur? Probably not through harmful mutations, which become rarer with each generation. But imprinting changes, of which Prof. Skinner's group has detected 50 and counting, persist through the generations.

The ink is barely dry on the human genome project, but already researchers are onto the "second genetic code," or the pattern of silencers on our DNA. Using a technology called MethylScope ("methyl" is the DNA silencer), "we will map this second genetic code to see which genes are imprinted and identify any differences between normal and cancerous cells," says Nathan Lakey, chief executive of Orion Genomics, a closely held biotechnology concern.

Those differences may become the foundation for molecular diagnostic tests within three years, perhaps starting with colon cancer. Normally, the copy of a gene called IGF2 that you get from dad is active, the copy from mom silenced. In 10% of us, though, mom's copy has thrown off the silencer, leading to a greater risk of colorectal cancer. Detecting that unsilencing could provide an early warning of the disease.

Re: Sharon Begley's "Science Journal"

PostPosted: Tue Oct 29, 2019 2:15 am
by admin
Improved Formula: In England, Girls Are Closing Gap With Boys in Math
by Jeanne Whalen and Sharon Begley
Wall Street Journal - March 30, 2005

LEICESTER, England -- In her 10th-grade math class, Frankie Teague dimmed the lights, switched on soothing music and handed each student a white board and a marker. Then, she projected an arithmetic problem onto a screen at the front of the room.

"As soon as you get the answer, hold up your board," she said, setting off a round of squeaky scribbling. The simple step of having students hold up their work, instead of raising their hands or shouting out the answer, gives a leg up to a group of pupils who have long lagged in math classes -- girls.

Ms. Teague's teaching methods are part of broad changes in how math is taught in England's classrooms. Starting in the late 1980s, England's education department worried that lessons relied too heavily on teachers lecturing and students memorizing. So it began promoting changes in teaching methods, textbooks and testing in both state-funded and private schools. The changes were designed to help all students, but educators have noticed a surprising side effect: Girls are closing a decades-old gender gap -- and by many measures outscoring the boys.

The English record goes against theories that boys are innately destined to dominate math and science -- a view that caused a firestorm after recent remarks by Harvard University President Lawrence H. Summers. In discussing the preponderance of men in elite university science and engineering positions, Mr. Summers said "issues of intrinsic aptitude" might explain why more males than females score at the highest levels on measures of mathematical and scientific ability.

Elaborating in the ensuing debate over his comments, however, Mr. Summers said in a letter to the Harvard faculty that his "January remarks substantially understated the impact of socialization and discrimination, including implicit attitudes." He added that his remarks about why more boys than girls score at the extremes on math tests and other assessments "went beyond what the research has established."

The English experience with math education suggests that gender differences, even those that seem innate and based in biology, do not lead inevitably to any particular outcome. That view fits into a broader current sweeping over how scientists think of genetics. Many now believe that traits that seem intrinsic -- meaning those grounded in the brain or shaped by a gene -- are subject to cultural and social forces, and that these forces determine how a biological trait actually manifests itself in a person's behavior or abilities. An "intrinsic" trait, in other words, does not mean an inevitable outcome, as many scientists had long thought.

"What's now in play is the question of what it means for a trait to be innate," says Eric Turkheimer of the University of Virginia. In 2003, a study led by Prof. Turkheimer found that the influence of genes on intelligence varies with social class: In well-off children, genes seem to explain most IQ differences, but in disadvantaged minority children environmental influences have a greater impact.

In another study, men carrying a gene linked to aggression and criminality were no more likely than other men to become violent adults -- unless they were neglected or abused as children, according to a 2002 article published in the journal Science. And last summer, scientists in Canada reported that rats carrying a "neurotic" gene became more jumpy than their peers only if their mothers neglected them. In rats with attentive moms, the same DNA sequence produced mellow animals.

"What we're learning is that culture and experience actually imprint themselves on the brain, on biology," says science historian Londa Schiebinger of Stanford University in Palo Alto, Calif. In other words, nature and nurture work together in a much more sophisticated way than many scientists had previously thought.

England didn't take math education for girls seriously until the mid-1970s, when new antidiscrimination laws and a flood of gender research raised concerns about equality in the classroom. At the time, boys passed the math portion of an exam taken by all 16-year-olds, called the O-Level exam, at significantly higher rates than girls did. And more boys than girls achieved the top grades, educators say.

Gender experts began holding training courses for teachers, encouraging them to include girls more in classroom discussion and raise their own expectations of what girls could accomplish. Educators also began checking textbooks to eliminate gender stereotypes and include more positive images of girls excelling in math and science.

A national curriculum, introduced in 1988, quickly added up to gains for girls. It required all students to take certain core subjects and prevented high-school girls from dropping out of math or science before age 16. The curriculum mandated that students learn to analyze mathematical theories to give them a deeper understanding of the topic. That had the side-benefit of helping girls because they -- for what experts suspect is a combination of biological and social reasons -- often excel at such analysis. Boys typically enjoy and excel at traditional problem-solving because many see it as a competition, educators say.

The government also replaced the O-Levels with a new exam called the General Certificate of Secondary Education, or GCSE. That new exam required children to write an analysis of statistical data or a mathematical formula in the weeks before the exam and turn in their papers on exam day. Math exams in England also now give students partial credit for showing their work, even if they ultimately reach the wrong answer. This gives an advantage to girls, who are typically more methodical in writing out the problem step-by-step, educators say. Scotland and Wales, which along with England make up Great Britain, have separate educational systems and exams.

Many math teachers in England attribute girls' rising scores to the changes in exam content, a view some scientists support. Leonard Sax, an American pediatrician and author of the book "Why Gender Matters," says there are hints that "girls' brains are built for complexity and boys' brains are built for speed." One of the most consistent findings in education, he notes, is that on time-constrained, high-pressure tests, boys on average do better than one would expect based on their classwork, while girls do worse. "There are no differences in what girls and boys can learn," Dr. Sax says. "If the environment is right, girls can excel to the same degree and in the same subjects that boys do."

In 1988, the first year of the new test for 16-year-olds, 45.6% of boys and 38.2% of girls scored passing grades of A through C, according to government statistics. By the 1990s, boys and girls passed the math GCSE at nearly equal rates, but boys still outnumbered girls in achieving the top scores.

In the mid-1990s, the government made a push to make lessons more interactive. That was a departure from the 1980s and early 1990s, when most lessons consisted of what teachers here call "chalk and talk," or standing at the board and lecturing. The new methods, while not specifically designed to benefit girls, draw more kids into the lesson and help shy girls speak up and get noticed, teachers say.

In 1997, for the first time, a higher percentage of girls (46.9%) than boys (46.8%) scored passing marks, ranging from A-star to C, on the math GCSE, according to the Department for Education and Skills. In 2003, 52% of girls and 50% of boys did so. In 2004, 53% of girls and 52% of boys. While the percentages are close, the gains are a big change from the disparity of years past, educators say. Boys did come out on top in one area: 4.5% of boys, compared with 4% of girls, still achieved the highest possible score, called A-star.

The boys' advantage didn't seem to hold up in the next level of testing. English girls now outperform boys on the "A-Level" exams taken by 18-year-olds. Comparable to Advanced Placement exams in the U.S., the A-Level tests college-level math. In 2003-4, 41% of the girls taking math A-Levels attained the highest grade, compared with 39% of the boys.

"The perception of gender differences, that math is for boys, is vastly out of proportion to any evidence for them," says Jo Boaler, associate professor of mathematics education at Stanford and a former deputy director of national mathematics testing for 13-year-olds in the United Kingdom.

Despite gains for girls in math, problems remain. Over the past decade, the number of students age 16 and up opting to take A-Level math courses after they finish the mandatory curriculum has been declining. And fewer students overall are studying math in college. The British government fears this could lead to a shortage of engineers and other technical professionals in years to come. It has tried to publicize the appeal of careers in science and math in an attempt to reverse the decline.

By contrast, in the U.S., boys still outperform girls on standardized math tests. In 2004, for instance, 9.3% of boys and 4.4% of girls scored higher than 700 out of a possible 800 on the math portion of the SAT, according to the College Board, which administers the tests taken by college-bound high-school students. Also that year, 23.5% of boys and 17.1% of girls scored a 5 on one Advanced Placement calculus test, called the AB, where scores run from 1 through 5; 43.2% of boys and 34.8% of girls scored a 5 on the even more difficult Advanced Placement calculus BC test.

In England, schools are still experimenting with ways to boost girls' classroom experience in math. Teachers at St. Luke's, a state-run school in the southwestern city of Exeter, divided boys and girls into separate math classes a few years ago and have found the results encouraging. St. Luke's teachers say they don't yet have enough exam data to prove that the new system is better, but they say more girls are speaking up in class and seem to prefer the new arrangement. Girls who have switched back into co-ed lessons often ask why they can't go back to single-sex class, teachers say.

At Uffculme School, a state school in England's rural southwest, girls in a seventh-grade math class held their own with the boys one recent morning. They answered as many questions, demanded additional explanation when confused, and nearly all raised their hands when asked whether they liked math. But when asked who wanted to go on to use math in a career, mostly boys raised their hands, shouting out plans to "cure cancer" and go into accounting.

Hamilton Community College, where Ms. Teague teaches, is located in one of the poorest parts of Leicester, an industrial town in central England that has struggled with high unemployment. The school teaches children age 11 to 16, many of whom have difficult home lives. During class time, the scruffy halls are peppered with students sent out of class for bad behavior.

Ms. Teague grew up in Surrey, a more affluent part of England, and was one of few women studying math in college. She has taught in Leicester for most of her 13-year career. When she began, she often delivered lectures from the board, she says. But, over the years, her style has evolved to include more games and interactive lessons.

"I've put a huge focus on making my classroom safe, and encouraging them all to take part," says Ms. Teague, who is 36 years old.

The walls of her classroom are covered with math jokes, images of famous mathematicians and puzzles. Colorful paper cones, cylinders and pyramids dangle from the ceiling.

In spite of all the changes in her classroom and teaching style, Ms. Teague says that the top student in her classes each year is always a boy. And she thinks that boys have a greater natural ability to do math, a view that got Harvard's Mr. Summers into hot water. Still, Ms. Teague adds that she thinks the gender gap can be closed with innovative teaching. "It's about knowing the students, their characters, what they like, and how they learn," she says.

Using the government curriculum as her model, Ms. Teague has begun incorporating more visual and hands-on materials like mathematical card games and puzzles into her teaching. Ten minutes into her 10th-grade lesson, she passed out envelopes containing cards with word problems written on them. The topic: calculating percentages. The 14-year-olds spread the cards out on their desks and started solving them on paper. "A gas bill is £43.45. [Tax], charged at 8%, is then added. Find the amount of the [tax] paid," read one of the problems. Ms. Teague walked around checking the children's work, conferring with boys and girls who seemed confused. Over and over, she reminded the children to show all their work in writing.
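
For reference, the answer to that card works out as follows (my arithmetic, not shown in the article):

```latex
0.08 \times \pounds 43.45 = \pounds 3.476 \approx \pounds 3.48 .
```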

Ms. Teague also got the idea of using white boards from a detailed guide provided by the state. She says the boards encourage shy students, including girls, to participate more in lessons, since they can hold up their answers for only her to see, without risking the embarrassment of calling out the wrong answer in front of their peers.

During the white-board exercise, she noticed one girl slouching in her chair and not raising her board to answer questions. The girl, Ms. Teague knew, was shy and didn't like to be called on in front of her peers. So she knelt beside the girl's desk to encourage her. "I actually only said a couple of words -- 'OK, what is 43 times 4?' -- and she started writing," the teacher said later. "It was almost as if being noticed was her starting need."