This week’s award of the Nobel Prize for medicine to John Gurdon
and Shinya Yamanaka effectively recognizes the science of
epigenetics. Dr. Gurdon showed that almost any cell (in a frog)
contains all the genetic information to become an adult. What makes
the cell develop a certain way is a pattern of “epigenetic”
modifications to the DNA, specific to each tissue, that turn genes on
and off. Dr. Yamanaka showed that if you strip away those epigenetic
modifications (in a mouse), you can reprogram an adult cell into an
embryonic state.
Yet to most people the word “epigenetics” has come to mean
something quite different: the inheritance of nongenetic features
acquired by a parent. Most scientists now think the latter effect
is rare, unimportant and hugely overhyped.
There are several mechanisms for modifying DNA without altering
the genetic code itself, chiefly the chemical attachment of methyl
groups to the DNA and modifications to the histone proteins around
which it is wound. The key point is that these modifications survive
cell division.
This is crucial to the development of the body: It means brain
cells express different genes than kidney cells. Among the
implications: a faulty modification in the womb, caused perhaps by
maternal dietary deficiency, may condemn the baby to future
disease. Babies gestated during the “hunger winter” of 1944-45 in
the Netherlands were more likely to be obese and diabetic as
adults. Likewise, rat pups insufficiently licked and groomed by
their mothers are more likely to be stressed when they grow up.
The far bolder claim is that these modifications can be
transmitted between generations, surviving not only the ordinary
division of cells during growth (“mitosis”) but also the special
division that prepares cells for sexual reproduction as sperm or
eggs (“meiosis”).
This theory is controversial for three reasons. First, the
entire epigenetic mechanism is normally stripped away when an egg
or sperm is made. Second, this version of epigenetics rehabilitates
the theories of the French scientist Jean-Baptiste Lamarck, who two
centuries ago postulated that traits acquired during a lifetime can
be passed on to the next generation, a theory long since buried by
experiments. Third, the evidence for such a process is sketchy,
while the evidence that it has negligible impact, even if it can
occasionally happen, is strong.
Caroline Relton of Britain’s Newcastle University and George
Davey Smith of Bristol University, the editors of a recent special issue of the International
Journal of Epidemiology, conclude that epigenetic inheritance may
be a distracting wild-goose chase. Yet headlines proclaim “a turning point in our understanding of
heredity” and “why your DNA isn’t destiny.”
The evidence to back up such claims is threadbare. Frequently
mentioned is a study of Överkalix, a remote parish in northern
Sweden that suffered famines whose effects supposedly show up in the
health of the second generation of descendants. But the sample size
is small, the effects are marginal, and no specific epigenetic
reprogramming has been established as the cause. Evidence from rats
is slightly better: One study found the energy metabolism of
pregnant rats affected by what had happened to their parents. But
the most famous animal case, involving a mouse coat-color trait tied
to diabetes and obesity, is erratic and atypical.
Moreover, beginning a century ago with Wilhelm Johannsen, who
coined the word “gene,” many experiments have ruled out all but the
most trivial Lamarckian effects. These now-forgotten tests involved
“pure lines” of genetically identical plants or animals. The
variation within pure lines, in the weight of beans from bean plants
for example, again and again showed no heritability, ruling out
heritable nongenetic effects.
As Dr. Davey Smith puts it: “The conclusion from over 100 years
of research must be that epigenetic inheritance is not a major
contributor” to physical resemblance across generations.