BRAINWELL BLOG

Airwaves and Brainwaves: Does TV Really 'Rot' Your Brain?

Living in what some have dubbed the "New Golden Age of Television," many of us are spending our evenings on the couch, binge-watching one of the multitude of critically acclaimed shows that light up our televisions, tablets, and phones on demand. It’s no surprise that in the era of Mad Men, Breaking Bad, and The Walking Dead, we find more and more excuses to spend an evening glued to the screen. But what is the long-term cost? Could the new golden age of television spell trouble for us in our golden years?

Parents have long been warned of the "brain rotting" danger of letting their children spend too much time beneath the glow of a television set. In one of the first studies to use modern neuroimaging techniques to examine the neurological effects of TV viewing habits, researchers at Tohoku University in Japan recently linked high levels of television viewing in childhood with poor cognitive performance and increases in orbitofrontal cortical thickness. Readers should note that increased cortical thickness, in this case, is not a good sign. A healthy, developing cortex generally becomes thinner throughout childhood as it prunes synapses in an effort to become a more efficient computational machine. So, while "rotting" may not be an apt term, TV does seem to interfere with healthy brain development.

But we’re grown-ups! Our brains have long since passed the developmental stages of childhood and adolescence, so we needn’t feel guilty about our plan to spend the holiday season finally catching up on Game of Thrones. Or do we?

Tina D. Hoang, M.S.P.H., and Dr. Kristine Yaffe have led a multidisciplinary team of researchers who have spent the last quarter of a century studying this question.

The researchers followed 3,247 participants from 1985 to 2011, collecting data on television viewing habits and physical activity. After 25 years of data collection, the participants underwent a battery of neuropsychological tests.

The results, released this week in JAMA Psychiatry, a journal published by the American Medical Association, were clear. Participants who spent their 20s and 30s watching a lot of television (3+ hours a day) performed significantly worse in their 40s and 50s than their non-couch-potato counterparts on tests measuring processing speed and executive function. Young adults who combined low levels of physical activity with high levels of television watching fared even worse: they were twice as likely to exhibit poor cognitive performance at their mid-life assessment as their highly active, TV-avoiding peers.

This finding comes on the heels of an earlier study out of Case Western Reserve University that linked television viewing in mid-adulthood with increased risk for Alzheimer’s disease in late life. In that study, researchers reported that each additional daily hour of television viewing was associated with a 1.3x higher risk of Alzheimer’s.

These findings make it clear that excessive TV watching at any point in life can have detrimental, long-term effects on brain health. However, the scientists point out that TV itself is probably not the real culprit. Rather, high rates of TV viewing are often indicative of a more general problem: a lifestyle deficient in intellectually and socially stimulating activities.

So don’t cancel your Netflix subscription just yet. Although excessive TV watching should be avoided, spending a few hours each week (or even the odd binge night) watching your favorite shows isn’t a cause for concern so long as you fill your daily life with activities that are physically, socially, and intellectually stimulating.