(Excerpt posted with permission of the author)
Cold War Childhood
A story could start almost anywhere. This one begins at a moment startled by a rocket.
In the autumn of 1957, America was not at war … or at peace. The threat of nuclear annihilation shadowed every day, flickering with visions of the apocalyptic. In classrooms, “duck and cover” drills were part of the curricula. Underneath any Norman Rockwell painting, the grim reaper had attained the power of an ultimate monster.
Dwight Eisenhower was most of the way through his fifth year in the White House. He liked to speak reassuring words of patriotic faith, with presidential statements like: “America is the greatest force that God has ever allowed to exist on His footstool.” Or: “Without God there could be no American form of government, nor an American way of life. Recognition of the Supreme Being is the first—the most basic—expression of Americanism.” Such pronouncements drew a sharp distinction between the United States and the Godless Communist foe.
But on October 4, 1957, the Kremlin announced the launch of Sputnik, the world’s first satellite. God was supposed to be on America’s side, yet the Soviet atheists had gotten to the heavens before us. Suddenly the eagle of liberty could not fly nearly so high.
Sputnik was instantly fascinating and alarming. The American press swooned at the scientific vistas and shuddered at the military implications. Under the headline “Red Moon Over the U.S.,” Time quickly explained that “a new era in history had begun, opening a bright new chapter in mankind’s conquest of the natural environment and a grim new chapter in the cold war.” The newsmagazine was glum about the space rivalry: “The U.S. had lost its lead because, in spreading its resources too thin, the nation had skimped too much on military research and development. Russia’s victory in the satellite race proved that the U.S. had not tried hard enough.”
At a diplomatic party, Washington’s famed “hostess with the mostest” Perle Mesta bristled when an administration official told her that Sputnik would be forgotten in six months. Mesta shot back: “And in six months we may all be dead.” The White House tried to project calm; Eisenhower said the satellite “does not raise my apprehension, not one iota.” But many on the political spectrum heard Sputnik’s radio pulse as an ominous taunt.
A heroine of the Republican right, Clare Boothe Luce, said the satellite’s beeping was an “outer-space raspberry to a decade of American pretensions that the American way of life was a gilt-edged guarantee of our material superiority.” Newspaper readers learned that Stuart Symington, a Democratic senator who’d been the first secretary of the air force, “said the Russians will be able to launch mass attacks against the United States with intercontinental ballistic missiles within two or three years.” Most worrisome was the fact that Sputnik’s first-stage rocket had more than 200,000 pounds of thrust—eight times what the USA was prepared to put behind its first satellite launch, set for a few months later.
The heft of Sputnik made America seem like a space-age lightweight. “The few who are allowed to know about such things and are able to understand them are saying that the launching of so big a satellite signifies that the Soviets are much ahead of this country in the development of rocket missiles,” columnist Walter Lippmann wrote a week after the 184-pound Sputnik went aloft. He added: “In short, the fact that we have lost the race to launch the satellite means that we are losing the race to produce ballistic missiles. This in turn means that the United States and the western world may be falling behind in the progress of science and technology. This is a grim business.”
A New York Times article matter-of-factly referred to “the mild panic that has seized most of the nation since Russia’s sputnik was launched two weeks ago.” In another story, looking forward, Times science reporter William L. Laurence called for bigger pots of gold at the end of scientific rainbows: “In a free society such as ours it is not possible to ‘channel human efforts’ without the individual’s consent and wholehearted willingness. To attract able and promising young men and women into the fields of science and engineering it is necessary first to offer them better inducements than are presently offered.”
As if to underscore that Sputnik hadn’t been a fluke, on November 3 the Soviet Union followed up by launching a second satellite—at 1,100 pounds, six times the weight of the first. While it orbited the Earth, the new capsule housed a dog whose live countenance, circling the planet every hour and three quarters, became a canine symbol of Russia’s triumph in space.
The autumn satellites of 1957 lit a fire under the federal government and the scientific establishment in the United States. For the U.S. space effort, progress came in fits and starts. On December 6, a test satellite dubbed Vanguard blew up seconds after firing. At last, in early February 1958, an American satellite—the thirty-pound Explorer—went into orbit. But four days later, a Vanguard launch again quickly fizzled with an explosion in the air. That month, the government set up its first space agency.
What had succeeded in powering the Explorer satellite into space was a military rocket, developed by a U.S. Army research team. The head of that team, the rocket scientist Wernher von Braun, was boosting the red-white-and-blue after the fall of his ex-employer, the Third Reich. In March 1958 he publicly warned that the U.S. space program was a few years behind the Russians.
Soon after dusk, while turning a skate key or playing with a hula hoop, children might look up to see if they could spot the bright light of a satellite arching across the sky. But they could not see the fallout from nuclear bomb tests, underway for a dozen years by 1958. The conventional wisdom, reinforced by the press, downplayed fears while trusting the authorities; basic judgments about the latest weapons programs were to be left to the political leaders and their designated experts.
Even with all the assurances during the decade, worries grew about health effects of radioactivity from above. But apologists often blamed the nefarious enemy. “On Your Guard: Reds Launch ‘Scare Drive’ Against U.S. Atomic Tests,” said a 1955 Los Angeles Examiner headline over one nationally distributed column, which told of “a big Communist ‘fear’ campaign to force Washington to stop all American atomic hydrogen bomb tests.” The Washington Post, the Chicago Daily News, and other major newspapers published similar messages from another syndicated columnist, David Lawrence, who wrote in a typical piece: “Evidence of a world-wide propaganda is accumulating. Many persons are innocently being duped by it and some well-meaning scientists and other persons are playing the Communist game unwittingly by exaggerating the importance of radioactive substances known as ‘fallout.’” Lawrence portrayed the star-spangled bomb explosions as beneficial: “The Nevada tests are being conducted for a humanitarian purpose—to determine the best ways to help civilian defense—and not to develop stronger weapons of war.” Such claims were ludicrous. And dangerous.
In the community of Railroad Valley not far north of the Nevada Test Site, a boy named Martin Bardoli died of leukemia months after entering grade school in 1956. When his parents circulated a petition and sent it to government officials, Senator George Malone responded with a letter cautioning against unfounded alarm and adding “it is not impossible to suppose that some of the ‘scare’ stories are Communist inspired.”
On the weekly prime-time Walt Disney television show, an animated fairy with a magic wand urged youngsters to drink three glasses of milk each day. But airborne strontium-90 from nuclear tests was falling on pastures all over, migrating to cows and then to the milk supply and, finally, to people’s bones. Radioactive isotopes from fallout were becoming inseparable from the human diet.
The more that work by expert scientists endangered us, the more we were informed that we needed those scientists to save us. Who better to protect Americans from the hazards of the nuclear industry and the terrifying potential of nuclear weapons than the best scientific minds serving the industry and developing the weapons?
In June 1957—the same month Nobel Prize–winning chemist Linus Pauling published an article estimating that ten thousand cases of leukemia had already occurred due to U.S. and Soviet nuclear testing—President Eisenhower proclaimed that the American detonations would result in nuclear warheads with much less radioactivity. Ike said that “we have reduced fallout from bombs by nine-tenths,” and he pledged that the Nevada explosions would continue in order to “see how clean we can make them.” The president spoke just after meeting with Edward Teller and other high-powered physicists. Eisenhower assured the country that the scientists and the U.S. nuclear test operations were working on the public’s behalf. “They say: ‘Give us four or five more years to test each step of our development and we will produce an absolutely clean bomb.’” But sheer atomic fantasy, however convenient, was wearing thin.
Many scientists actually opposed the aboveground nuclear blasts. Relying on dissenters with a range of technical expertise, Democratic nominee Adlai Stevenson had made an issue of fallout in the 1956 presidential campaign. During 1957—a year when the U.S. government set off thirty-two nuclear bombs over southern Nevada and the Pacific—Pauling spearheaded a global petition drive against nuclear testing; by January 1958 more than eleven thousand scientists in fifty countries had signed.
Clearly, the views and activities of scientists ran the gamut. But Washington was pumping billions of tax dollars into massive vehicles for scientific research. These huge federal outlays were imposing military priorities on American scientists without any need for a blatant government decree.
The book that I’ve remembered most vividly from my childhood, David and the Phoenix, was a selection of the Weekly Reader Children’s Book Club in 1958. The story packed an emotional wallop, with themes that foreshadowed decades of conflicts involving science, careers, violence, and reverence for life.
It’s summer, and David’s family moves into a house with a wondrous mountain just behind the backyard. David, maybe ten years old, climbs the mountain and discovers a large, awesome bird. The Phoenix is glad to assist with the boy’s education, which the erudite bird is quick to distinguish from schooling. (“Life is real, life is earnest. One must face it with a practical education.”) Transported on the Phoenix’s back, David goes to fascinating and mystical places. But there’s danger lurking, as the Phoenix explains: “I had been here no more than three months when a Scientist was hot on my trail. A most disagreeable fellow, always sneaking about with binoculars, a camera, and, I fear, a gun.”
Down from the mountain one night, David walks into the living room only to discover that his mother and father are hosting the Scientist, who is talking excitedly. “It’s the discovery of the age,” the honored guest is saying. “My name will be famous if I succeed in my plans.”
The Scientist finally closes in—as it happens, on Phoenix’s five-hundredth birthday—a day when, not quite knowing why, the fantastic bird has built a pyre. After David and Phoenix enjoy a lovely picnic on the mountainside, Phoenix sprinkles the pyre with cinnamon. And David realizes, with horror, what is about to happen. Averting his eyes, the child hears the scrape of a match, then crackling branches … indistinct time passes … and then, through a smoky haze, he sees the charred pile stir, and a magnificent young bird emerges. And then, from partway down the mountain, comes the sound of a man shouting. It’s the Scientist, running up the trail and waving a rifle.
Paralyzed with fear, David remained on his knees as the Scientist reached an open place and threw the gun up to his shoulder. The bullet went whining by with an ugly hornet-noise, and the report of the gun echoed along the scarp.
“Fly, Phoenix!” David sobbed. A second bullet snarled at the bird, and spattered out little chips of rock from the inner wall of the ledge.
“Oh, fly, fly!” David jumped up and flung himself between the bird and the Scientist. “It’s me!” he cried. “It’s David!” The bird gazed at him closely, and a light flickered in its eye as though the name had reached out and almost, but not quite, touched an ancient memory. Hesitantly it stretched forth one wing, and with the tip of it lightly brushed David’s forehead, leaving there a mark that burned coolly.
“Get away from that bird, you little idiot!” the Scientist shrieked. “GET AWAY!”
David ignored him. “Fly, Phoenix!” he cried, and he pushed the bird toward the edge.
Understanding dawned in the amber eyes at last. The bird, with one clear, defiant cry, leaped to an out-jutting boulder. The golden wings spread, the golden neck curved back, the golden talons pushed against the rock. The bird launched itself into the air and soared out over the valley, sparkling, flashing, shimmering; a flame, large as a sunburst, a meteor, a diamond, a star, diminishing at last to a speck of gold dust, which glimmered twice in the distance before it was gone altogether.
While many scientists climbed toward career peaks as fast as their brains would carry them, the continuation of life was in the crosshairs of very big guns. For the first time, weaponry at hand could bag the game with absolute finality: turning the current generations to ash all at once, with no one left to mourn or to carry on. The thermonuclear invention might end all death and life, courtesy of the most “advanced” science that money could buy.
The U.S. Treasury kept funneling billions into science with a doomsday twist. The trend had become evident soon after the Second World War. In autumn 1946, speaking at a public-affairs forum in New York, atomic physicist Philip Morrison noted that the U.S. military was funding a hefty portion of scientific research. “Some schools,” he said, “derive 90 percent of their research support from navy funds.” Morrison saw where the juggernaut was headed: “The now amicable contracts will tighten up and the fine print will start to contain talk about results and specific weapon problems. And science itself will have been bought by war on the installment plan.”
The purchase was apparent. As Morrison commented, “The physicist knows the situation is a wrong and dangerous one. He is impelled to go along because he really needs the money.” By the time the century reached its midpoint, several dozen major universities held large nuclear contracts with the government.
In a lament that aired on NBC Radio in early 1950, the physics pioneer Leo Szilard—whose prewar work had made sustained chain reactions possible—raised a warning about the slippery slope to mass destruction. “In 1939 when we tried to persuade the government to take up the development of atomic energy, American public opinion was undivided on the issue that it is morally wrong and reprehensible to bomb cities and to kill women and children,” he said. “During the war, almost imperceptibly, we started to use giant gasoline bombs against Japan, killing millions of women and children; finally we used the A-bomb. I believe there is a general uneasiness among the scientists. It is easy for them to agree that we cannot trust Russia, but they also ask themselves: To what extent can we trust ourselves?”
Such provocative questions went largely ignored. The decision to develop the hydrogen bomb followed a brief and secretive high-level debate that President Truman settled in 1950. Truman brushed off the physicists who counseled against going ahead with the “super bomb”—scientists were mere formula-crunchers with little political clout, unless their prestige and zeal helped propel Washington’s top policymakers where they wanted to go. The same hierarchy that asserted its civilian control over the military also asserted its civilian authority over science, if only by dint of appropriations. A physicist with no budget might just as well be in a sandbox.
Truman rejected a somber report from the Atomic Energy Commission’s advisory committee. (The chairman of the panel, J. Robert Oppenheimer, had become a national hero four years earlier for leading the secret effort to develop the atom bomb at the Los Alamos laboratory.) Assessing the hydrogen bomb, the report said: “It is clear that the use of this weapon would bring about the destruction of innumerable human lives; it is not a weapon which can be used exclusively for the destruction of material installations of military or semi-military purposes. Its use therefore carries much further than the atomic bomb itself the policy of exterminating civilian populations.” A hydrogen bomb could top the destructive power of an atomic bomb by a factor of hundreds. At the fulcrum of the twentieth century, going ahead with H-bombs would catapult the world to the brink of full-blown nuclear holocaust.
The Los Alamos lab began joint work with the new Lawrence Livermore laboratory, which focused on the hydrogen bomb from the day its doors opened in 1952. Both labs operated under the aegis of the University of California; the academic affiliation served as a useful air freshener to cover the stench of Armageddon technology. Across the country a labyrinth of top-clearance facilities cranked out the collaborative work of academia, profit-driven contractors, and government agencies. Incalculable resources fueled the Bomb—the capital “B” would later fade as the presence of nuclear weapons became routine—an immutable fact of life.
“Delivery systems” could be faster and more elusive; “payloads” smaller and more powerful. In the early ’50s, the first H-bombs were the size of large buildings, set off on Pacific islands far from American shores. News reports and Washington’s political viewfinders abstracted into fuzziness the horrific realities of nuclear tests. Tropical locales of inestimable beauty, amid green dollops and sandy spits in the ocean, with names like Bikini and Eniwetok, vaporized in a split-atom second that flashed ultrabright, and then blotted out the sun. Repeatedly, a lacquer of radioactive isotopes settled onto a former paradise; thick, white fallout sometimes coated beaches, foliage, and the tops of palm trees. As the decade went on, cancer and birth defects began to afflict native islanders. Meanwhile, far away, Americans embraced risqué bathing suits known as bikinis.
Young people—dubbed “baby boomers,” a phrase that both dramatized and trivialized them—were especially vulnerable to strontium-90 as their fast-growing bones absorbed the radioactive isotope along with calcium. The children who did as they were told by drinking plenty of milk ended up heightening the risks—not unlike their parents, who were essentially told to accept the bomb fallout without complaint.
Under the snappy rubric of “the nuclear age,” the white-coated and loyal American scientist stood as an icon, revered as surely as the scientists of the enemy were assumed to be pernicious. And yet the mutual fallout, infiltrating dairy farms and mothers’ breast milk and the bones of children, was a type of subversion that never preoccupied J. Edgar Hoover.
What was being suppressed might suddenly pop up like some kind of jack-in-the-box. Righteous pressure against disruptive or “un-American” threats was internal and also global, with a foreign policy based on containment. Control of space, inner and outer, was pivotal. What could not be controlled was liable to be condemned.
The ’50s and early ’60s are now commonly derided as unbearably rigid, but much in the era was new and stylish at the time. Suburbs boomed along with babies. Modern household gadgets and snazzier cars appeared with great commercial fanfare while millions of families, with a leg up from the GI Bill, climbed into some part of the vaguely defined middle class. The fresh and exciting technology called television did much to turn suburbia into the stuff of white-bread legends—with scant use for the less-sightly difficulties of the near-poor and destitute living in ghettos or rural areas where the TV lights didn’t shine. On the surface, most kids lived in a placid time, while small screens showed entertaining images of sanitized life. One among many archetypes came from Betty Crocker cake-mix commercials, which were all over the tube; the close-ups of the icing could seem remarkable, even in black and white. Little girls who had toy ovens with little cake-mix boxes could make miniature layer cakes.
Every weekday from 1955 to 1965 the humdrum pathos of women known as housewives could be seen on Queen for a Day. The climax of each episode came as one of the competitors, often sobbing, stood with a magnificent bouquet of roses suddenly in her arms, overcome with joy. Splendid gifts of brand-new refrigerators and other consumer products, maybe even mink stoles, would elevate bleak lives into a stratosphere that America truly had to offer. The show pitted women’s sufferings against each other; victory would be the just reward for the best, which was to say the worst, predicament. The final verdict came in the form of applause from the studio audience, measured by an on-screen meter that jumped with the decibels of apparent empathy and commiseration, one winner per program. Solutions were individual. Queen for a Day was a nationally televised ritual of charity, providing selective testimony to the goodness of society. Virtuous grief, if heartrending enough, could summon prizes, and the ecstatic weeping of a crowned recipient was vicarious pleasure for viewers across the country, who could see clearly America’s bounty and generosity.
That televised spectacle was not entirely fathomable to the baby-boom generation, which found more instructive role-modeling from such media fare as The Adventures of Spin and Marty and Annette Funicello and other aspects of the Mickey Mouse Club show—far more profoundly prescriptive than descriptive. By example and inference, we learned how kids were supposed to be, and our being more that way made the media images seem more natural and realistic. It was a spiral of self-mystification, with the authoritative versions of childhood green-lighted by network executives, producers, and sponsors. Likewise with the sitcoms, which drew kids into a Potemkin refuge from whatever home life they experienced on the near side of the TV screen.
Dad was apt to be emotionally aloof in real life, but on television the daddies were endearingly quirky, occasionally stern, essentially lovable, and even mildly loving. Despite the canned laugh tracks, for kids this could be very serious—a substitute world with obvious advantages over the starker one around them. The chances of their parents measuring up to the moms and dads on Ozzie and Harriet or Father Knows Best were remote. As were, often, the real parents. Or at least they seemed real. Sometimes.
Father Knows Best aired on network television for almost ten years. The first episodes gained little momentum in 1954, but within a couple of years the show was one of the nation’s leading prime-time psychodramas. It gave off warmth that simulated intimacy; for children at a huge demographic bulge, maybe no TV program was more influential as a family prototype.
But seventeen years after the shooting stopped, the actor who had played Bud, the only son on Father Knows Best, expressed remorse. “I’m ashamed I had any part of it,” Billy Gray said. “People felt warmly about the show and that show did everybody a disservice.” Gray had come to see the program as deceptive. “I felt that the show purported to be real life, and it wasn’t. I regret that it was ever presented as a model to live by.” And he added: “I think we were all well motivated but what we did was run a hoax. We weren’t trying to, but that is what it was. Just a hoax.”
In TV-land, as elsewhere, hoaxsters could earn a living, sometimes a very good one. There was no cabal. That was the system. Hoaxing was most of all about coaxing money out of pockets. And the proliferation of advertising on the increasingly powerful new medium was the essence of hoax. On television, hucksterism boomed through programs and commercials alike.
Mad Magazine was the only mass-distributed challenge to the saturating culture of hoax in the 1950s. While Consumer Reports tried to counteract advertising with factual evaluations of product quality, Mad’s satiric mission went for the jugular. The grinning young icon Alfred E. Neuman served as a zany alter ego for readers while the editors promoted slyly subversive sensibilities. For many kids, Mad was the first public source to acknowledge that respectables were lying to them on a regular basis—the first methodical exposure of absurd gaps between pretenses and realities. The professionals at work along Madison Avenue and Pennsylvania Avenue were frequent targets; so were more general patterns of conceits. For instance, in an era of new plastics widely regarded as virtuously antiseptic, a Mad cartoon spoofed such fixations: “Untouched by human hands,” said a billboard outside a factory. Inside, chimpanzees were at the assembly line.
Symbolic of the shift into the 1960s was the election of a young president who had baby-boom children. John F. Kennedy arrived with pledges of renewal after campaigning with false claims that the USA was on the short end of a “missile gap” with the Soviet Union. He often emphasized science as the way to explore the new frontier on Earth and in space.
During the same autumn JFK won the presidency, John Hersey came out with The Child Buyer, a novel written in the form of a hearing before a state senate committee. “Excuse me, Mrs., but I wonder if you know what’s at stake in this situation,” a senator says to the mother of a ten-year-old genius being sought for purchase by the United Lymphomilloid corporation. “You realize the national defense is involved here.”
“This is my boy,” the mom replies. “This is my beautiful boy they want to take away from me.”
A vice president of United Lymphomilloid, “in charge of materials procurement,” testifies that “my duties have an extremely high national-defense rating.” He adds: “When a commodity that you need falls in short supply, you have to get out and hustle. I buy brains. About eighteen months ago my company, United Lymphomilloid of America, Incorporated, was faced with an extremely difficult problem, a project, a long-range government contract, fifty years, highly specialized and top secret, and we needed some of the best minds in the country …”
Soon, most of the lawmakers on the committee are impressed with the importance of the proposed purchase for the nation. So there’s some consternation when the child buyer reports that he finally laid his proposition “squarely on the table”—and the boy’s answer was no.
Senator Skypack exclaims: “What the devil, couldn’t you go over his head and just buy him?”
The Child Buyer is a clever send-up, with humor far from lighthearted. Fifteen years after Hersey did firsthand research for his book Hiroshima, the Cold War had America by the throat. The child buyer (whose name, as if anticipating a Bob Dylan song not to be written for several more years, is Mr. Jones) tells the senate panel that his quest is urgent, despite the fifty-year duration of the project. “As you know, we live in a cutthroat world,” he says. “What appears as sweetness and light in your common television commercial of a consumer product often masks a background of ruthless competitive infighting. The gift-wrapped brickbat. Polite legal belly-slitting. Banditry dressed in a tux. The more so with projects like ours. A prospect of perfectly enormous profits is involved here. We don’t intend to lose out.”
And what is the project for which the child will be bought? A memorandum, released into the hearing record, details “the methods used by United Lymphomilloid to eliminate all conflict from the inner lives of the purchased specimens and to ensure their utilization of their innate equipment at maximum efficiency.” First comes solitary confinement for a period of weeks in “the Forgetting Chamber.” A second phase, called “Education and Desensitization in Isolation,” moves the process forward. Then comes a “Feeding Period”; then major surgery that consists of “tying off” all five senses; then the last, long-term phase called “Productive Work.” Asked whether the project is too drastic, Mr. Jones dismisses the question: “This method has produced mental prodigies such as man has never imagined possible. Using tests developed by company researchers, the firm has measured I.Q.’s of three fully trained specimens at 974, 989, and 1005 …”
It is the boy who brings a semblance of closure on the last day of the hearing. “I guess Mr. Jones is really the one who tipped the scales,” the child explains. “He talked to me a long time this morning. He made me feel sure that a life dedicated to U. Lympho would at least be interesting. More interesting than anything that can happen to me now in school or at home…. Fascinating to be a specimen, truly fascinating. Do you suppose I really can develop an I.Q. of over a thousand?”
But, a senator asks, does the boy really think he can forget everything in the Forgetting Chamber?
“I was wondering about that this morning,” the boy replies. “About forgetting. I’ve always had an idea that each memory was a kind of picture, an insubstantial picture. I’ve thought of it as suddenly coming into your mind when you need it, something you’ve seen, something you’ve heard, then it may stay awhile, or else it flies out, then maybe it comes back another time. I was wondering about the Forgetting Chamber. If all the pictures went out, if I forgot everything, where would they go? Just out into the air? Into the sky? Back home, around my bed, where my dreams stay?”
I went to the John Glenn parade in downtown Washington on February 26, 1962, a week after he’d become the first American to circle the globe in a space capsule. Glenn was a certified hero, and my school deemed the parade a valid excuse for an absence. To me, a fifth grader, that seemed like a good deal even when the weather turned out to be cold and rainy.
For the new and dazzling space age, America’s astronauts served as valiant explorers who added to the élan of the Camelot mythos around the presidential family. The Kennedys were sexy, exciting, modern aristocrats who relied on deft wordsmiths to produce throbbing eloquent speeches about freedom and democracy. The bearing was American regal, melding the appeal of refined nobility and touch football. The media image was damn-near storybook. Few Americans, and very few young people of the era, were aware of the actual roles of JFK’s vaunted new “special forces” dispatched to the Third World, where—below the media radar—they targeted labor-union organizers and other assorted foes of U.S.-backed oligarchies.
But a confrontation with the Soviet Union materialized that could not be ignored. Eight months after the Glenn parade, in tandem with Nikita Khrushchev, the president dragged the world to a nuclear precipice. In late October 1962, Kennedy went on national television and denounced “the Soviet military buildup on the island of Cuba,” asserting that “a series of offensive missile sites is now in preparation on that imprisoned island.” Speaking from the White House, the president said: “We will not prematurely or unnecessarily risk the costs of worldwide nuclear war in which even the fruits of victory would be ashes in our mouth—but neither will we shrink from that risk at any time it must be faced.”
In our household, an elder half-heartedly piled cans of food and bottled water next to the ping-pong table in the basement. I didn’t know enough to be very worried, but my parents seemed edgy. So did my teacher, who saw kids glancing at the clock on the classroom wall and commented that she knew we must be thinking about the U.S. ships scheduled to begin enforcing a naval quarantine around Cuba; actually I had been eager to get out to recess. At the time, most children didn’t understand what came to be known as the Cuban Missile Crisis; it was mainly frightening in retrospect, when we realized that the last word could have been annihilation.
Early in the next autumn, President Kennedy signed the Limited Test Ban Treaty, which sent nuclear detonations underground. The treaty was an important public health measure against radioactive fallout. Meanwhile, the banishment of mushroom clouds made superpower preparations for blowing up the world less visible. The new limits did nothing to interfere with further development of nuclear arsenals.
Kennedy liked to talk about vigor, and he epitomized it. Younger than Eisenhower by a full generation, witty, with a suave wife and two adorable kids, he was leading the way to open vistas. Store windows near Pennsylvania Avenue displayed souvenir plates and other Washington knickknacks that depicted the First Family—standard tourist paraphernalia, yet with a lot more pizzazz than what Dwight and Mamie had generated.
A few years after the Glenn parade, when I passed the same storefront windows along blocks just east of the White House, the JFK glamour had gone dusty, as if suspended in time, facing backward. I thought of a scene from Great Expectations. The Kennedy era already seemed like the room where Miss Havisham’s wedding cake had turned to ghastly cobwebs; in Dickens’ words, “as if a feast had been in preparation when the house and the clocks all stopped together.”
The clocks all seemed to stop together on the afternoon of November 22, 1963. But after the assassination, the gist of the reputed best-and-brightest remained in top Cabinet positions. The distance from Dallas to the Gulf of Tonkin was scarcely eight months as the calendar flew. And soon America’s awesome scientific capabilities were trained on a country where guerrilla fighters walked on the soles of sandals cut from old rubber tires.