Oh L’Amour

Peter and I thoroughly enjoyed the HBO Max show It’s a Sin this weekend. Set in London in the 1980s, it follows a group of young gay men as they come out of the closet and are beset by AIDS. Not every character in the show is gay. Parents play key roles, and much of the action is seen through the eyes of a young woman, apparently straight, who is a friend and roommate. I think my favorite character in the show was Colin, a phlegmatic, responsible, incontrovertibly gay but decidedly unflashy young man from Wales, who works as an intern at a Savile Row haberdashery and seems too cautious to act on his sexual impulses. Most of the characters are types rather than individuals, in the way of TV shows, but Colin was a type I hadn’t seen on screen before, so I never knew what he was going to do next, which I liked. The show hits a few wrong notes (which I’ll get to), but I laughed and cried and at times found it almost overwhelmingly evocative.

The soundtrack has a lot to do with how evocative. Soft Cell, Yazoo, Erasure, Wham, Kate Bush, the Eurythmics, and of course PSB: in my recollection, these were in fact the songs that we were listening to. “Music was so much more fun then,” I muttered to Peter, early in the first episode, as the middle-aged are wont to do. Artificial, calculated, trivial, effeminate, decadent songs. More pleasurable than songs should have been, and pleasurable to a part of me that I felt I probably wasn’t supposed to be indulging. Since a few of the show’s actors speak with strong British accents and a certain amount of British slang, to us impenetrable, Peter and I watched with the subtitles on, which meant that when the closed-captioner mislabeled a cover of the song “I Feel Love,” Peter muttered, “That’s not Donna Summer. It’s Bronski Beat.” And when the closed-captioner characterized the intro to a Pet Shop Boys song as “melodramatic music,” I harrumphed. (I mean, yes. But . . .) Peter effortlessly remembers the release dates of almost all these songs and reports that the songs keep pace with the imagined moment of the TV show’s story with an almost military rigor.

That imagined moment doesn’t exactly match the timing of my own debut into gay life. The show begins in 1981 and ends in 1991; I came out in 1989. Some amateur, armchair sociology: apps have probably changed everything now, but in the old days, gay bars were the stage for the public drama of our lives, and my sense back then was that as a general rule, one got to tread that stage from roughly age 20 to 35, making a gay man’s generation a rather short fifteen years. I overlapped a little with the generation portrayed in the show, and I knew many people who were in that generation, but I wasn’t quite in it myself. And in terms of the history of AIDS, the half-generation separating us was crucial. Before I came out, I knew that AIDS was a lethal sexually transmitted disease caused by HIV, and I knew that I could lower my risk significantly by using condoms or modifying what I did in bed. When the characters in the show started having sex with one another, on the other hand, not even scientists knew any of that. My gay generation lived in the shadow of AIDS, and it shaped nearly everything about our romantic and sexual lives, but we did not bear as heavy a burden of illness and death as the men who preceded us.

Which doesn’t mean we bore no burden, nor does it mean we weren’t completely petrified. When the central gay character in the show, a charming, fey, bumblingly but also ruthlessly narcissistic aspiring actor named Ritchie Tozer (played by Olly Alexander, who in real life is a pop star who has collaborated with the Pet Shop Boys), neurotically examines himself for spots and sores, I identified, uncomfortably. Also, sitting in a clinic waiting for one’s name (or pseudonym) to be called, terrified to hear the results of one’s HIV test: twenty-five years later it’s still a little too real and maybe also too soon? Not all the memories that the show brought back were grim. I also recognized the way some of the characters went to bed with each other as a way of becoming friends, a lighthearted aspect of gay life that the HBO show Looking largely overlooked in its focus on the drama of Finding the Right Man. I hope that in the age of apps, hook-ups still sometimes undergo that kind of evolution. It would be regrettable if the freemasonry of pleasure were to give way to an assortative rationalization of it (he says from the safe, ignorant harbor of middle-aged monogamy).

To add a little amateur history to the amateur sociology: The hinge for my gay generation came in 1996, when researchers proved that treating HIV with three different antiretroviral drugs at the same time boxed the virus into a metaphorical corner that it couldn’t mutate itself out of. In a wealthy country like America—or, rather, in zip codes in a wealthy country like America where people could afford healthcare—everything about AIDS changed once this treatment became available. Life expectancies dramatically lengthened, and the reigning social understandings of the disease were transformed. From the virus’s perspective, unfortunately, the new treatment figured only as a speed bump, if that. The rate of HIV infection in the U.S. population has declined only moderately since triple-drug treatment debuted. Given current rates of infection, one out of six gay or bisexual men in America will still contract HIV during his lifetime. Meanwhile, AIDS remains the leading cause of death in several countries in sub-Saharan Africa, and death rates from the disease also remain high in Russia, Thailand, and parts of the Caribbean and South America.

The show ends, as I said, in 1991, so that, medically speaking, it depicts a world where the cavalry never arrives. (It’s thus a big weepie, be forewarned.) It’s maybe irrecoverable now how potent the fear of death combined with the centuries-old stigma around homosexuality then was. A few instances of ostracization are dramatized in the show, but what maybe can’t be put on screen is the subsonic rumble of silence, the rarely voiced distaste of the larger society. (It wasn’t until June 15, 1987, for example, that the New York Times allowed the use of the word gay as a value-neutral synonym for homosexual. In the mid 1990s, a boyfriend of mine kept on his wall a front page of the New York Times that he had had framed because it used the word three different times, which impressed him as a milestone.) Many times, in the past twelve months, I have had the morbid thought of how much nicer it is to go through a plague that straight people are dying in larger numbers from, too. It’s a relief not to have to worry about upsetting them if one fails to hide or downplay sufficiently one’s fear or grief. It has been liberating to listen to them loudly and confusedly arguing about what matters more—the meanings and pleasures available through human contact, or the safety that can be afforded by isolation. There’s an enjoyable perversity in noticing that people who prioritized safety over contact during the AIDS epidemic were tagged as conservative, while those who prioritize safety during COVID-19 are thought of as liberal. In It’s a Sin, a distraught mother exclaims that society would react very differently if there were a disease killing as many straight men as AIDS was killing gay ones. The claim isn’t speculation any more; as of this past year, it’s proven historical fact.

I seem to have got distracted from the criticism that I was going to make, but maybe in the end it’s more an observation. The generation who hit the gay bars in 1981 was to a startling extent wiped out. If you were to try to sell a nostalgic TV show just to them, you’d have almost no audience. The scriptwriters of It’s a Sin have understandably made a few adjustments. One is putting at the center of the story not a gay man but a young woman named Jill, who ends up being more dedicated to AIDS activism than most of her gay male friends. That in itself isn’t a distortion of the historical evidence. There were many such women in real life, many of them lesbian (Jill’s sexuality is left undefined). (Full disclosure: though I had friends in the activist movement, and felt sympathetic with it, I never took part myself.) Another modification is a shift in political sensibility. In one scene, when a somewhat dotty neighbor volunteers that she thinks AIDS victims are angels in disguise, one of the gay characters is so outraged that he throws a traffic barrier through her shop window. That felt like an off note. The movement’s displays of anger were almost always directed toward authorities and institutions—government, churches, pharmaceutical companies—and they were strategic, planned carefully in advance. To meet misguided sympathy with violence seems more a Twitter kind of mood—a retrojection. Similarly, in another late scene, a character berates a dead gay man’s mother, blaming on her all the shame and stigma that gay men dying of AIDS have been made to carry. I’m afraid I cringed. It sounded to me like something a teenager might wish he could say—or something a TV producer might imagine a teenager wishes to say—but not like the sort of thing that one grieving adult would say to another, no matter how misbegotten the other person’s understanding of sexuality might be. It’s a common move nowadays on social media to write people off, but one of the brilliant tactics of the AIDS activist movement was to engage with opponents, win their respect, and sometimes turn them into allies. (Cf. Larry Kramer and Anthony Fauci.)

A few early, expensive comments by the novel about the other talent in the room

I recently read a stack of Elizabeth Taylor’s novels, in preparation for writing an introduction to her novel A Game of Hide-and-Seek, to be reprinted as an NYRB Classic in February. In one of her later novels, In a Summer Season, published in 1961, there is a description of the advent of television into an upper-middle-class British household, which strikes me as an early attempt by the novel to reckon with its new rival. I wonder if anyone has made a collection of such scenes.

The book’s heroine is named Kate. Her adult son Tom lives with her, as does her second husband, Dermot, and an aunt named Ethel who tries to keep out of everyone’s way. The two men are to a certain extent allied against Kate in not-quite-reputable habits of leisure. Their unseemly alliance culminates in Tom’s purchase of a television set.

“Why can’t you read a book instead?” Kate asked [Tom]. She disdained such ways of passing time, not realising that she very seldom read herself these days and was just off for an evening in the pub with Dermot. Tom kept the television set in his bedroom and he and Dermot liked to sit there with curtains drawn against the sunshine, watching cowboy films. “Too good an evening to waste out of doors,” Dermot would say, taking a last glimpse out of the bedroom window while the set was warming up—Ethel’s dog lying on the hot gravel down there, a column of gnats dancing in the shaft of light under some trees and high above the trees some cirrus clouds paling and dissolving. “Right!” Tom would say, drawing up two uncomfortable bedroom chairs. On the screen, rods of light ran blindingly into one another, the picture steadied until they were able to see a packet of soap powder capering on tiny legs, singing a ditty. It then took a dive into a washing-machine, and a head of jostling bubbles, singing too, rose up.

The scene visible through the window is quiet enough to lend itself to a meditative description; the screen, by contrast, is too loud to meet with anything but sarcasm. But even more telling is the perfidy of Ethel, who means to resist but isn’t strong enough to.

Sometimes Ethel joined them, looking in with a trivial excuse, begging them not to stir and lingering to watch, but as if her attention was only momentarily caught. After hovering for a while, she gradually merged into the shadowy background and was forgotten, until at last, with sick-bed caution, she tiptoed away. Like hares before a serpent, Tom and Dermot sat rigid and in silence. From time to time, their hands groped on the floor for their glasses of light ale, their cigarettes burnt to their fingers.

A chapter or so later, after deprecating the television in a gossipy letter to a friend, Ethel is mesmerized by a broadcast of Swan Lake on ice.

Does television impair intellect?

As I explained in an earlier post, my review-essay “Twilight of the Books” appears in the 24 December 2007 issue of The New Yorker, and as an online supplement, I’m summarizing some of the data that I drew from, organizing the summaries by topic, and including links where I can. These are merely evidence in raw form and are probably a bit indigestible taken en masse. For analysis and discussion and hopefully a more pleasant read, please see the New Yorker article itself.

Yesterday: Is literacy declining? Today: Does television impair academic performance and cognitive development?

There is little doubt that television is on average bad for a person’s intellectual development. If you read through the studies below, however, you will see that there’s some dispute about whether a small dose of the right kind of television at the proper age might be beneficial.

  • A 2001 meta-analysis of data on more than 1 million students found “little room for doubt concerning the negative nature of the overall linear relationship between television viewing and educational achievement.” However, the analysis also suggested that the relationship was not best graphed as a straight line but rather as “an inverted check mark shape,” also called a curvilinear graph. That is, for each age, there is an optimal viewing time, up to which point television viewing is beneficial, and above which it is harmful. The author, Micha Razel, found that this optimum was 2 hours a day for nine-year-olds, 1.5 hours a day for thirteen-year-olds, and 0.5 hours a day for seventeen-year-olds. The benefit of the optimal viewing time decreased with age. Razel found that 55 percent of the students in the data set were exceeding their optimal viewing time by 3 hours a day, and that this excess viewing was lowering academic achievement by about one grade level. Razel speculated that “the finding of a larger optimal viewing time for younger children may be related to their higher quality viewing,” that is, to the probability that they were watching educational programs under parental supervision. [Micha Razel, “The Complex Model of Television Viewing and Educational Achievement,” {link to citation only} Journal of Educational Research, July/August 2001.]
  • In a summary of pre-2002 research, two authors concluded that “educational television has a substantial positive impact and that entertainment television has a negative impact.” The benefit of shows such as Sesame Street has been extensively documented, the authors wrote. One study found that boys who watched educational television at age five had higher grades even in high school (the effect for girls was not statistically significant). On the other hand, a study of the introduction of television into Canada found that its arrival lowered the reading scores of second graders. The positive impact seems to be limited to educational television watched during the preschool and early elementary-school years, and above an optimum level, television watching hampers academic achievement, in a so-called curvilinear graph. Does television displace reading? Educational programs encourage reading, but entertainment programs do displace it in the early, “decoding” stages when reading is an effortful activity. Once reading habits are established, television seems not to affect them. The authors were not persuaded that television shortened attention spans, interfered with homework, taught children to expect all learning to be effortless, or was inherently disabling of cognition. [Marie Evans Schmidt and Daniel R. Anderson, “The Impact of Television on Cognitive Development and Educational Achievement,” Children and Television: Fifty Years of Research, {book available for purchase} Lawrence Erlbaum Associates, 2007.]
  • In a longitudinal study of 330 German kindergarteners and second graders, who were assessed through time-use diaries and achievement tests between 1998 and 2001, researchers found that children who watched entertainment television scored significantly lower on tests of phonological awareness, reading skills, and general achievement, even if their viewing was relatively light. The effect increased as time passed, so that by the third grade, heavy viewers were between one and one and a half years behind light viewers in reading scores. Since light viewers outperformed medium viewers, who in turn outperformed heavy viewers, the study seems to contradict the notion that the relation between television viewing and academic achievement is “curvilinear,” that is, that a certain moderate amount of television is beneficial. (Note that heavy viewers in Germany would be classified as normal viewers in America.) In the German study, the correlation between educational television and test-score gains ranged “from insignificant to moderately positive”—much less substantial than American researchers have found. The researchers controlled for IQ, socioeconomic status, and early literacy. [Marco Ennemoser and Wolfgang Schneider, “Relations of Television Viewing and Reading: Findings from a 4-Year Longitudinal Study,” {link to citation only} Journal of Educational Psychology, 2007.]
  • A survey of 1,008 parents in February 2006 found that babies who were eight to sixteen months old knew six to eight fewer words for each additional hour of baby DVDs and videos watched daily. [Frederick J. Zimmerman, Dimitri A. Christakis, and Andrew N. Meltzoff, “Associations between Media Viewing and Language Development in Children under Age 2 Years,” {link to citation only} Journal of Pediatrics, August 2007. Alice Park, “Baby Einsteins: Not so Smart After All,” Time, 6 August 2007. Lisa Guernsey, “The Genius of ‘Baby Einstein’,” New York Times, 16 August 2007.]
  • Through a re-analysis of longitudinal health data for 1,278 children who were seven years old between 1996 and 2000, researchers found that hours spent watching television at ages one and three correlated with a higher likelihood of attention disorder at age seven. The correlation was present even when the data were controlled for factors such as mother’s drug use or socioeconomic status, and the investigators called the link “robust and stable.” Other researchers, however, have questioned the way that the study’s authors defined attention deficit disorder. [Dimitri A. Christakis, Frederick J. Zimmerman, David L. DiGiuseppe, and Carolyn A. McCarty, “Early Television Exposure and Subsequent Attentional Problems in Children,” Pediatrics, April 2004. Roger L. Bertholf, Steve Goodison, Dimitri A. Christakis, and Frederick J. Zimmerman, “Television Viewing and Attention Deficits in Children,” Pediatrics, 2004.] Note: A later Danish study failed to replicate the findings; the study’s authors argue that this may have been because Danish infants watched far less television and never crossed a certain threshold of exposure.
  • The effect of television on cognitive development was studied in a later re-analysis of the Christakis et al. data set, this time focusing on about 1,700 children who were six years old between 1996 and 2000. Hours of television watched before age three caused lower scores on several different cognitive tests. Television watched between ages three and five improved scores on a reading recognition test and a short-term memory test, though not on others. [Frederick J. Zimmerman and Dimitri A. Christakis, “Children’s Television Viewing and Cognitive Outcomes,” Archives of Pediatric and Adolescent Medicine, July 2005.]
  • In a spring 2000 study of 410 third graders in northern California, students with a television in their bedroom scored lower on all tests than those without one, and students with a computer at home scored higher. The lowest scores belonged to students who newly acquired a bedroom television in the course of the study. However, self-reports by the students did not support the hypothesis that television use was displacing time spent on homework. [Dina L. G. Borzekowski and Thomas N. Robinson, “The Remote, the Mouse, and the No. 2 Pencil: The Household Media Environment and Academic Achievement among Third Grade Students,” Archives of Pediatric and Adolescent Medicine, July 2005.]
  • A September 1999 survey of 4,508 middle school students in New Hampshire and Vermont found that the more time children spent with television and video games during the week, the less likely they were to have excellent grades, and the more likely to have below-average grades. When the data were adjusted to control for covariates such as level of maternal support and child’s rebelliousness, the correlation between television and grades held up, but the correlation between video games and poor academic performance disappeared. In a later letter, the authors suggested that their evidence on video games was weak because few children in the study played video games for as many hours as they watched television, and argued that if they had, the study would have been able to detect video games’ effect on grades. Academic performance was also impaired by having more cable channels available at home, being allowed by parents to watch any kind of content, and being allowed by parents to watch R-rated movies. [Iman Sharif and James D. Sargent, “Association between Television, Movie, and Video Game Exposure and School Performance,” Pediatrics, 2006. Jerald J. Block, Iman Sharif, and James D. Sargent, “Lack of Association between Video Game Exposure and School Performance,” Pediatrics, February 2007.]
  • The effect of television on attention and learning difficulties was studied through interviews with 678 families in upstate New York between 1983 and 2004, when the families’ children were ages fourteen, sixteen, twenty-two, and thirty-three. Fourteen-year-olds who watched one or more hours of television daily were more likely to have poor grades, to fail to complete high school, and not to attend college. The effect was present whether the children tested high or low on verbal skills, and whether or not their parents had completed college. Those who watched more than three hours of television a day at age fourteen were even more susceptible to these failures; they were twice as likely not to earn a college degree as those who had watched less than an hour of television a day at age fourteen. Adolescents who watched more television at age sixteen than at fourteen raised their risk of future failure; those who watched less lowered it. When environmental factors were controlled for, learning problems did not in themselves predict future television watching habits. In other words, television was shown to cause academic failure, but not the other way around. [Jeffrey G. Johnson, Patricia Cohen, Stephanie Kasen, and Judith S. Brook, “Extensive Television Viewing and the Development of Attention and Learning Difficulties during Adolescence,” Archives of Pediatric and Adolescent Medicine, May 2007.]
  • The effect of television on long-term academic achievement was studied through interviews with 980 New Zealanders born in 1972 and 1973, starting when they were age five and ending when they were twenty-six. The more television the subjects watched during the week in childhood and adolescence, the more likely they were to leave high school without a diploma, and the less likely to earn a college degree. The effects were present even after adjusting for IQ, socioeconomic status, and childhood behavioral problems. The authors considered it possible that a poor academic experience in high school might have been causing some of the increase in adolescent television watching, but pointed out that there could not have been any reverse causation at work in the strong correlation between childhood television watching and failure to earn a college degree. The less television a child watched, the better his educational outcome; there was “little support for the hypothesis that a small amount of television is beneficial.” [Robert J. Hancox, Barry J. Milne, and Richie Poulton, “Association of Television Viewing During Childhood with Poor Educational Achievement,” Archives of Pediatric and Adolescent Medicine, July 2005.]

Next: Does internet use improve or impair academic performance? Does it decrease the amount of time spent reading?

UPDATE (27 Feb. 2009): For ease in navigating, here’s a list of all the blog posts I wrote to supplement my New Yorker article “Twilight of the Books”:

Notebook: “Twilight of the Books” (overview)
Are Americans Reading Less?
Are Americans Spending Less on Reading?
Is Literacy Declining?
Does Television Impair Intellect?
Does Internet Use Compromise Reading Time?
Is Reading Online Worse Than Reading Print?
I also later talked about the article on WNYC’s Brian Lehrer Show and on KUER’s Radio West.
And, as a bonus round: Does media violence lead to real violence, and do video games impair academic performance?