The Scientific Evidence

At the outset, it is important to determine whether, as a matter of science, one's genotype really does unalterably predetermine one's physiological, let alone behavioral, future. To some extent, the language and symbols of the debate have been captured by the popular culture and the salesmanship or grantsmanship of the genetics community. Thus, the Human Genome Project has spawned images of the "Rosetta stone" and "the holy grail of biology," and the individual's genome as his or her "coded future diary" (see generally Hubbard and Wald 1993; Shuster 1992).

There is no scientific evidence to support such absolutist, deterministic views of the role of genes, and there is quite a bit of evidence to the contrary. The closest association between genotype and phenotype is in the monogenic disorders, the classical, Mendelian, genetic diseases caused by a single gene, such as Tay-Sachs disease, Duchenne muscular dystrophy, and Huntington disease. Even in individuals who have the allele for a single-gene disorder, such as myotonic dystrophy, geneticists cannot say with certainty whether the individual will be affected, what the age of onset will be, or how severe the condition will be. Huntington disease was long used as an example of a disorder with complete penetrance, but new evidence casts doubt on this conclusion (Rubinsztein et al. 1996; Nance 1996).

Several genetic principles contribute to this imprecision, including the following: variable penetrance (the likelihood that a genotype will be expressed as a phenotype), allelic heterogeneity (the varieties of mutations of the gene, such as the more than 600 mutations of the cystic fibrosis gene), variable expressivity (the range of severity of the condition if it is expressed), imprinting (variations in the phenotype depending on whether an allele was inherited from the mother or father), allelic expansion (the tendency of some trinucleotide repeats to increase in the number of repeats in a succeeding generation), and the rate of spontaneous mutations (the probability that an individual will be affected without inheriting an aberrant gene from either parent).
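Penetrance, the first of these concepts, can be made precise as a conditional probability. The following is a minimal formal sketch in standard probability notation; it is an illustration added here, not a formula drawn from the works cited.

```latex
% Penetrance: the probability that a genotype is expressed as the phenotype.
\mathrm{penetrance} \;=\; P(\text{affected} \mid \text{genotype present})
% "Complete penetrance" is the limiting case in which this probability is 1,
% the assumption long (and, it now appears, wrongly) made for Huntington disease.
```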

Polygenic disorders are caused by the interaction of two or more genes. A variety of metabolic and other disorders are thought to be polygenic. Multifactorial or complex disorders, such as many cancers, are thought to be caused by genetic and environmental factors, acting either individually or, more commonly, in combination. As to vast numbers of polygenic and complex disorders, the predictive value of the presence of a single mutation can only be expressed as a probability, and it may never be possible to determine the precise effect of genetic factors with great certainty.
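One way to see why such predictions are irreducibly probabilistic is to write the risk conferred by a single mutation as an average over unmeasured background factors. This is a schematic formulation in standard notation, offered as an illustration rather than a formula from the cited literature:

```latex
% Risk from a single mutation G, averaged over unmeasured genetic and
% environmental background factors E:
P(\text{affected} \mid G) \;=\; \sum_{e} P(\text{affected} \mid G,\, E = e)\, P(E = e)
% Unless every relevant background factor is identified and measured, the
% left-hand side can be reported only as a probability, never as a certainty.
```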

The final area of scientific inquiry, behavioral traits, is the most contentious. There are several scientific obstacles to correlating genotype and behavior. One problem is in defining the end point, whether it be schizophrenia or intelligence. Another problem is in excluding other possible causes of the condition, thereby permitting a determination of the significance of any supposed correlation. Much of today's research on genes and behavior engenders very strong feelings because of the social and political consequences of these supposed truths. Thus, more than any other aspect of genetics, discoveries in behavioral genetics should not be expressed as irrefutable until there has been substantial scientific corroboration.

Prior scientific assertions of the genetic domination of behavior have not stood the test of time. In 1865 Galton published two papers in Macmillan's Magazine titled "Hereditary Talent and Character." Galton began part two of his article by stating: "I have shown, in my previous paper, that intellectual capacity is so largely transmitted by descent that, out of every hundred sons of men distinguished in the open professions, no less than eight are found to have rivaled their fathers in eminence. It must be remembered that success of this kind implies the simultaneous inheritance of many points of character, in addition to mere intellectual capacity. A man must inherit good health, a love of mental work, a strong purpose, and considerable ambition in order to achieve successes of the high order of which we are speaking" (Galton 1865, 318).

In retrospect, Galton's methodology for reaching his conclusion was absurd. He determined that between 1453 and 1853 a total of 605 "notables" lived. He then explored their relatives and found that there were 102 "relationships" among the "notables," such as father and son. The mere existence of so many familial associations meant, ipso facto, that talent and character were hereditary. In my cursory review of the 605 "notables" identified by Galton during this 400-year period, it appears that the list primarily (or perhaps exclusively) consists of white Europeans and their descendants (including John Adams, John Quincy Adams, and Samuel Adams from the United States). Furthermore, the notables were almost exclusively male, virtually all Christian, and overwhelmingly British. This must have been a comforting but not surprising discovery in Victorian England.

Galton also turned his keen eye to the United States. He observed that the "North American people has been bred from the most restless and combative class of Europe" (Galton 1865, 325), based on their willingness to flee their native country and seek a better life in the United States. "If we estimate the moral nature of Americans from their present social state, we shall find it to be just what we might have expected from such a parentage. They are enterprising, defiant, and touchy; impatient of authority; furious politicians; very tolerant of fraud and violence; possessing much high and generous spirit, and some true religious feeling, but strongly addicted to cant" (Galton 1865, 325). It is interesting to compare Galton's "scientific" insights, which are thoroughly explained in the two sentences quoted above, with Alexis de Tocqueville's astute personal observations, which required the four volumes of Democracy in America to express.

By today's standards, Galton's methods and conclusions are ludicrous. Yet, in his day and for decades thereafter, his research was considered unimpeachable. In 1917, using the Stanford-Binet IQ test, Lewis Terman attempted, retrospectively, to measure the IQ of some of the greatest figures in history, work that later grew into a five-volume study. Francis Galton was posthumously assigned an IQ of 200 for his pioneering work in psychology, although Galton's contributions in forensics, meteorology, and statistics have stood the test of time better (Gould 1981, 184).

In recent years, the most controversial application of genetic determinism has been The Bell Curve, published in 1994 by Richard J. Herrnstein and Charles Murray. According to the authors, "IQ is substantially heritable. . . . The genetic component of IQ is unlikely to be smaller than 40 percent or higher than 80 percent" (Herrnstein and Murray 1994, 105). A 1997 meta-analysis of 200 familial IQ studies, however, estimated "broad sense" heritability at 48 percent and "narrow sense" heritability at 34 percent (Devlin et al. 1997). Some experts, such as Howard Gardner (1983), question whether there is such a thing as general, measurable, innate intelligence; other experts argue, in effect, that even if there is, its significance is greatly overestimated (see, e.g., Andrews and Nelkin 1996).
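The terms "broad sense" and "narrow sense" have standard definitions in quantitative genetics, sketched below in textbook notation (the variance decomposition is the standard one and is not specific to Devlin et al.):

```latex
% Phenotypic variance splits into genetic and environmental components:
V_P = V_G + V_E, \qquad V_G = V_A + V_D + V_I
% (V_A: additive effects; V_D: dominance; V_I: epistatic interactions)

% Broad-sense heritability: fraction of variance from all genetic effects.
H^2 = V_G / V_P

% Narrow-sense heritability: fraction from additive effects alone.
h^2 = V_A / V_P
```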

In his popular 1995 book, Emotional Intelligence, Daniel Goleman describes findings from the famous marshmallow study at a preschool at Stanford University in the 1960s. Psychologist Walter Mischel had experimenters tell individual four-year-olds in preschool: "If you wait until after I run an errand (which took about 15 minutes), you can have two marshmallows, but if you can't wait, you can have one and only one marshmallow now." A third of the children grabbed the marshmallow right away; the others waited. When the tested children were evaluated as they were graduating from high school, those who had waited turned out to be substantially better students than those who had not. At age four, the ability to delay gratification was twice as powerful a predictor of future SAT scores as was IQ (Goleman 1995, 81-82). Goleman also reviews other studies suggesting that in older students optimism and hope are stronger predictors of success than IQ.

The Bell Curve is not controversial because, as a psychological treatise, it underestimated the predictive power of marshmallows. It is controversial because it is a political manifesto. According to the authors, not only is IQ real and inherited, but there is an innate difference in IQ based on race, with whites having IQs one standard deviation higher than blacks. "Inequality of endowments, including intelligence, is a reality. Trying to pretend that inequality does not really exist has led to disaster. Trying to eradicate inequality with artificially manufactured outcomes has led to disaster" (Herrnstein and Murray 1994, 551). These "disasters" include much of what can be characterized as the liberal welfare state, including antipoverty programs, welfare, education, affirmative action, and other aspects of daily life. R. Grant Steen bluntly concludes that The Bell Curve is "a political agenda masquerading as science, a mean-spirited diatribe against the poor and disenfranchised, and a pseudointellectual legitimization of racism" (Steen 1996, 113).

The Bell Curve epitomizes biological determinism, and biological and genetic determinism naturally lead to a political philosophy. According to Lewontin et al., "the presence of such biological differences between individuals of necessity leads to the creation of hierarchical societies because it is part of biologically determined human nature to form hierarchies of status, wealth, and power" (Lewontin et al. 1984, 68). Thus, genetic determinism is the scientific justification for societal inequality, social Darwinism, and the status quo.

Flawed scientific theories can be refuted by more rigorous science. A more perplexing social problem involves the permissible societal response to legitimate discoveries in behavioral genetics. Undoubtedly, there is some correlation between certain genes and behavioral traits (see, e.g., Sherman et al. 1997). The only serious scientific dispute concerns the overall degree of correlation and the applicability of genetic factors in a range of specific behavioral traits. What, then, are the likely psychological, social, political, and legal consequences of such correlations?

As an example, take the case of alcoholism. Several past and ongoing studies have explored whether there is a genetic component to alcoholism. Assume there is such a component in some cases of alcoholism. Does that mean that, as a society, we will be more or less tolerant of alcoholics, more or less inclined to mandate genetic testing for such an allele or alleles, or more or less likely to embrace the disease model of alcoholism? On the one hand, it could be argued that the genetic component vitiates the moral taint from individuals with alcoholism. On the other hand, the genetic, heritable nature of the disorder may increase the stigma associated with alcoholism; it may increase the pressure for genetic screening for the mutation; it may contribute to individuals feeling a sense of resignation and a reluctance to enter treatment; and it may lead to disdain for individuals who, despite knowledge that they have the mutation, proceed to drink nonetheless. Research to find an association between genes and alcoholism is being conducted at the Ernest Gallo Clinic and Research Center at the University of California-San Francisco (Miller 1994). If a genetic link to alcoholism were to be established, some of the social pressure against alcoholic beverages and their purveyors might be deflected onto "faulty" genes.

Similar issues are raised with regard to a possible genetic link to homosexuality. If we find a "gay gene," will it mean greater or lesser tolerance? My suspicion is that it will not change the way most people view homosexuals. For individuals who are tolerant of homosexuals, it will reaffirm that the behavior is physiologically based and does not represent moral depravity. On the other hand, for individuals who are intolerant of homosexuality, it will confirm their view that such individuals are "abnormal." It also could lead to proposals that those affected by the "disorder" should undergo treatment to be "cured" and that measures should be taken to prevent the birth of other individuals so afflicted.

Complex social questions are posed by nearly all of the reported or imagined possible discoveries in behavioral genetics. Issues such as drug dependence, violence, and personality traits all may be viewed through a genetic lens. Some scholars are even studying whether economic behavior (Wheeler 1996) and legal doctrines (Berkman 1997) are biologically based.

Let me explore one example of the dangers of behavioral genetic determinism. In late 1995, the New York Times published an article discussing the findings of researchers at Johns Hopkins University, first published in the journal Nature (Nelson et al. 1995), that male mice specifically bred to lack a gene essential for the production of nitric oxide, a molecule that allows nerve cells to communicate, are relentlessly aggressive against fellow males, often to the point of killing them (Angier 1995). They are also sexually aggressive with female mice. The question immediately raised was whether a similar finding was possible in humans, which would genetically account for violence. Dr. Solomon Snyder, the lead author of the study, was quoted as saying that the team planned to pursue the possibility that the nitric oxide synthase gene was involved in some small percentage of human aggression. He said it would be a relatively straightforward matter of looking at certain populations, such as the mentally ill or the imprisoned, to screen for defects in the gene.

At a time when there is justifiably widespread concern about violence, the reductionist and determinist view that a single gene is responsible for some percentage of violence, and that the mutation can be screened for, is very appealing. However, it is seductively misleading and threatening. On the basis of a single study on rodents, the researchers were apparently prepared to test their findings on the most vulnerable groups, seemingly without concern for the ethical and social issues raised by such research, and ignorant of the strict limitations on research involving prisoners and individuals who lack the mental capacity to give informed consent.

Not long after the Nature article, a much less publicized article appeared in the Journal of the American Medical Association (Needleman et al. 1996). The article reported a study of 800 boys attending public schools in Pittsburgh. According to the authors, the leading predictor of aggressiveness and delinquency in the boys studied was the level of lead in their bodies, which was attributable to environmental pollution, ingestion of lead, and other sources. Apparently environmental causes of violence, even those subject to remediation, are less exciting to the public than purported genetic causes.

Even for purely physical disorders, it is important to recognize that a genetic prognosis in the absence of a treatment or cure may have substantial negative social consequences. Are we, as a society, going to be drawn into two camps by genetic testing? In the first group would be the "worried well"—individuals at risk of a future genetic disorder who may never become ill but to whom every cough is the first sign of lung cancer or every dropped paper clip is the start of an irreversible neuromuscular disorder. The other group would be composed of risk takers who, fearing the inevitability of their demise, embark on skydiving and alligator wrestling. And how would you know whether you would be in the first or the second group? As reported in the journal Nature Genetics in 1996, researchers have discovered a genetic explanation for risk aversion or risk taking (Epstein et al. 1996). Thus, inevitable behavioral genetics will determine how we respond to inevitable physical genetics.

Should we accept these scientific assertions without asking hard questions? As Dreyfuss and Nelkin point out, the image of absolute neutrality that science historically has sought to project is not in accord with experience. "The history of science is replete with cases where the choice of research topics, the nature of scientific theories, and the representation of research results are socially structured, and shaped by cultural forces, to reflect ... assumptions of particular societies at particular times" (Dreyfuss and Nelkin 1993, 339-340). The history of behavioral genetics also is replete with retracted or unreplicated studies (Detera-Wadleigh et al. 1989; Kelso et al. 1989) and the misuse of established data.

Before leaving the topic of nature versus nurture, it is important to note that not all observers are comfortable with a bipolar model. Theologian Ted Peters argues that the problems of genetic determinism are not eliminated but merely replaced by embracing environmental or cultural determinism. Either form of determinism, or a combination of the two, is fatally flawed in his view, because both types of determinism overlook the theological and spiritual significance of God-given human freedom (Peters 1997). Yet, even free will has been ascribed a genetic basis. According to philosopher Evan Fales, "just as an incapacity to reason or make choices can be (and sometimes is), unfortunately, genetically ordained, so too it is genes that ordain the sort of brain design that, in humans, is a necessary condition for the capacity to reason well and to freely choose" (Fales 1994, 57-58).
