Eye-based Paternity Testing & Other Human Genetics Myths
In 2001, scientists announced an incredible accomplishment: they had completed a draft sequence of the human genome. The complete instruction book for making a human being spans 24 distinct chromosomes and is 3.2 billion letters long. That’s about 1,000 times the length of the first ten Wheel of Time books put together. Sequencing the whole thing had taken ten years and roughly three billion dollars.
That was a considerable investment of taxpayer money, but the scientists made bold promises. They said it would be the scientific breakthrough of the century. With the sequence of the genome in hand, they promised to dramatically improve the prevention, diagnosis, and treatment of disease. They told us the completion of the human genome would mark a new era for human health.
Has it? That’s not an entirely fair question. Finishing the genome was only the starting point in a long journey toward understanding how our genes make us who we are. The more they study it, the more scientists have found that the genome is incredibly complex. I know, because I’m one of them. I work as a human geneticist at one of three large-scale DNA sequencing centers in the United States.
Unfortunately, few things about genetics and inheritance are straightforward. They’re certainly not as simple as we often see them portrayed in books, movies, and other media. As a scientist who also enjoys science fiction, I often encounter popular misconceptions about how genetics actually works. Here are a few of the more common (and inaccurate) tropes:
- The eye-based paternity test
Oh, if I had a dime for every time a character recognized a long-lost parent or sibling based on eye color, a widow’s peak, a peanut allergy, or some other physical quirk. Sure, first-degree relatives do tend to look alike, and many visible traits tend to run in families. Yet they should not be used to establish (or disprove) kinship because it’s not that simple.
Eye color, despite the common wisdom suggesting otherwise, is a complex inherited trait. While it’s true that blue eyes tend to be recessive and brown eyes tend to be dominant, eye color is a spectrum, not a multiple-choice test. The color of the iris is determined by the amount of melanin in it, which can be influenced by as many as 10 different genes. Brown-eyed parents can have blue-eyed children and vice versa. Also, eye color can change: many newborns have blue eyes that become brown or green during early childhood.
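To make the polygenic point concrete, here’s a toy simulation. The additive loci, the “brown” threshold, and the mapping from melanin score to color are all invented for illustration, not real biology. It models iris melanin as the sum of contributions from 10 hypothetical genes, then shows that two brown-eyed parents who are heterozygous at every locus regularly produce children who fall below the brown threshold:

```python
import random

# Toy model (not real biology): iris melanin as the sum of
# contributions from 10 hypothetical additive loci. At each locus,
# allele 1 adds melanin and allele 0 does not.
NUM_GENES = 10          # "as many as 10 different genes"
BROWN_THRESHOLD = 8     # arbitrary cutoff for a brown-eyed phenotype

def random_gamete(genotype):
    """Pick one allele per locus, at random, from a parent."""
    return [random.choice(pair) for pair in genotype]

def melanin_score(genotype):
    return sum(a + b for a, b in genotype)

# Two brown-eyed parents, each heterozygous (1, 0) at every locus,
# so each scores 10 and sits comfortably above the threshold.
parent = [(1, 0)] * NUM_GENES

random.seed(42)
trials = 10_000
lighter_than_parents = 0
for _ in range(trials):
    child = list(zip(random_gamete(parent), random_gamete(parent)))
    if melanin_score(child) < BROWN_THRESHOLD:
        lighter_than_parents += 1

print(f"{lighter_than_parents / trials:.0%} of children fall below the brown threshold")
```

With these made-up numbers, roughly one child in eight lands below the threshold, which is the whole point: a trait controlled by many additive loci can skip, lighten, or darken between generations without anyone’s kinship being in doubt.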
Please, don’t rely on physical characteristics to tell who’s related to whom. The inheritance of such traits does not always follow a predictable pattern. Even when it does, in real life, these kinds of tests might uncover secrets that were better left buried.
When we do genetics studies of families, we verify the expected relationships as a quality control step. About 4% of the time, there’s a discrepancy (most often, the reported father is not the biological father). This observation holds true across racial groups and socioeconomic strata, and has been consistently reported by many researchers for over a decade.
We call these “non-paternity events” and, generally speaking, we don’t report them back to the study participants.
- Different people with different genes
Often I hear people discussing how someone has “the gene” for some trait or ability. Alternatively, an elderly person in good health is often said to have “good genes.” In truth, we all have the same set of about 20,000 genes. In very rare cases, large segments of the genome can be deleted (which removes genes), and usually that’s a very bad thing. So the concept of people having “different genes” is not accurate. It’s the genetic variation within and around genes that accounts for the differences between us.
That said, I recognize that most people use the term genes colloquially. I don’t expect people to start saying, “So you’re 95 years old? You must have a really good set of genetic variants in your genome.”
While we’re on the topic, I should tell you that traditionally defined genes—that is, things that code for proteins—occupy only about 2.5% of the human genome. Non-coding sequences make up the rest of it. Some of them may regulate when or how much certain genes are turned on, or help organize the genome inside the cell. Still others provide physical structures that serve another purpose, such as the repetitive sequences that make up the telomeres (ends) of chromosomes.
But much of the genome either has no specific function or serves a purpose that we haven’t yet uncovered.
- Your genetic destiny is written
GATTACA became one of my favorite science fiction movies long before I entered the field of genetics. It portrays a near-future dystopian society in which the worth and future potential of an individual are determined, at birth, with a genetic analysis. As a result, most parents take advantage of genetic selection/enhancement of embryos to get the ideal combination in their future child. These designer babies get the cool jobs, whereas babies born without such intervention are basically treated as invalids.
On the bright side, the idea of sequencing every person’s genome at birth is rapidly becoming more plausible. Thanks to the advent of “next-generation” DNA sequencing technologies, we can now sequence a human genome in less than a week, for a little over a thousand dollars. We can use that information to infer a lot about a person, such as ancestry, risk for certain diseases, and likely physical appearance. But we’re a long way off from predicting the lifetime risk for common diseases, like heart disease, diabetes, and psychiatric disorders.
Most of these result from complex interplay between genetic, lifestyle, and environmental factors. The vast majority of genetic variants associated with disease risk have a very small effect: they might increase your risk by 5%. There could be thousands of such genetic factors for any given disease, so predicting someone’s health at birth, even if we knew everything about the genome, would be a very complex problem.
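As a sketch of why that prediction is hard, here’s a toy polygenic model in which every number is invented for illustration: 1,000 hypothetical variants, each carried at 30% frequency in the population, each multiplying risk by 1.05 when present. Even with perfect knowledge of every variant, all you get is a spread of relative risks, not a verdict:

```python
import random

# Toy model; every number here is invented for illustration.
NUM_VARIANTS = 1000   # "thousands of such genetic factors"
CARRIER_FREQ = 0.30   # hypothetical population frequency per variant
RELATIVE_RISK = 1.05  # "might increase your risk by 5%"

def relative_risk_for_person():
    # Count how many risk variants this simulated person carries,
    # then express their risk relative to the population average.
    carried = sum(random.random() < CARRIER_FREQ for _ in range(NUM_VARIANTS))
    return RELATIVE_RISK ** (carried - NUM_VARIANTS * CARRIER_FREQ)

random.seed(1)
risks = sorted(relative_risk_for_person() for _ in range(1000))
print(f"10th percentile: {risks[100]:.2f}x baseline, "
      f"median: {risks[500]:.2f}x, 90th percentile: {risks[900]:.2f}x")
```

And this sketch is the easy version: it assumes every effect is known, independent, and purely multiplicative, ignoring the gene-gene and gene-environment interactions that make real risk prediction harder still.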
One thing I particularly admired about GATTACA was how the protagonist’s genetic future was described in probabilities: neurological condition, 60%; attention deficit disorder, 89%; heart disorder, 99%. There are few certainties in human genetics, and the movie did well to acknowledge this.
- Mutations are awesome
Mutations, or acquired changes in DNA, are one of the most misunderstood topics in genetics. Too often in science fiction, I see mutations treated as good or advantageous things. A telling example comes from the movie Resident Evil, in which the Red Queen (a sort of malicious AI in control of things) releases a genetically engineered monster that attacks the group of heroes. After it makes a kill, the Red Queen explains that once the creature feeds, it will mutate and become something new: presumably, an even stronger, deadlier monster.
The reality is that mutation, for humans at least, is uncommon. Most of the genetic variation that we have, we inherited from our parents. New mutations that arise in a child but are absent from both parents are extremely rare. We’re talking about 40 or 50 throughout the entire genome, compared to 3 million inherited genetic variants.
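Those two figures make for simple arithmetic:

```python
# Figures from the text: ~40-50 new mutations per genome versus
# ~3 million inherited variants.
DE_NOVO = 45          # midpoint of "40 or 50"
INHERITED = 3_000_000

print(f"New mutations: about {DE_NOVO / INHERITED:.4%} of a person's variants,")
print(f"or roughly 1 in {INHERITED // DE_NOVO:,}")
# Prints: about 0.0015%, or roughly 1 in 66,666
```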
Generally speaking, new mutations are not beneficial. The human genome has been shaped by natural selection for millions of years. Think of it like a Formula One racecar. Mutations are like metal screws that you add (or remove) at random. More than likely, this won’t have any effect on the racecar, but if it does, you’re far more likely to break something than to make it better.
The body’s cells also acquire mutations over time, sometimes by chance as cells divide, but also through DNA damage induced by radiation or carcinogens. Most cells that suffer damaging mutations will die. Occasionally, however, a cell gets the right set of mutations that allow it to grow and divide uncontrollably. When this happens, cancer is the result.
- Genetic blame and inevitability
I think that the most common myth about human genetics is that most traits are inherited in a simple and/or inevitable fashion. The genetics taught in most high school biology classes—like dominant, recessive, and X-linked inheritance patterns—may be partially to blame for this. Mendel’s laws and Punnett squares (remember those?) really only apply to traits controlled by a single gene, which mostly means rare genetic conditions. Cystic fibrosis and sickle-cell disease, for example, are recessive disorders caused by mutations in the CFTR and HBB genes, respectively.
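For the single-gene case where they do work, a Punnett square is easy to enumerate. This sketch uses cystic fibrosis, writing the normal CFTR allele as "C" and the recessive disease allele as "c", for two unaffected carrier (Cc) parents:

```python
from itertools import product

mother, father = "Cc", "Cc"   # two unaffected carriers

# Every combination of one allele from each parent, with each pair
# normalized so that "Cc" and "cC" count as the same genotype.
outcomes = ["".join(sorted(pair)) for pair in product(mother, father)]

for genotype in sorted(set(outcomes)):
    share = outcomes.count(genotype) / len(outcomes)
    print(f"{genotype}: {share:.0%}")
# Prints: CC: 25%, Cc: 50%, cc: 25% (only cc children are affected)
```

That tidy 25% risk holds for any two carriers of a recessive single-gene disorder. For complex traits, no such table exists, which is exactly where the high-school picture breaks down.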
Although Mendel’s laws offer a useful introduction to genetic inheritance, they become problematic when we try to apply them to more complex traits. In fiction, I often meet characters living under a specter of a disease that killed their grandparents and/or parents. It seems inevitable that they, too, will fall victim to it.
Alcoholism, for example, is a complex disorder that’s often treated simplistically: “My dad was an alcoholic, so I became one.”
I’m sorry to have to tell you this, but most of the traits that make for interesting characters—intelligence, attractiveness, physical/mental health, etc.—do not follow simple laws of inheritance. They might not be passed from parent to child, or shared by siblings. The genetics underlying these characteristics will undoubtedly be complicated.
Just like we are.