Why Is the U.S. So Bad at Protecting Workers From Automation?

Posted: 04 Jan 2018 04:00 AM PST

In a televised speech to the nation in February 1981, then-President Ronald Reagan warned that 7 million Americans were caught up "in the personal indignity and human tragedy of unemployment." "If they stood in a line, allowing three feet for each person," he explained, "the line would reach from the coast of Maine to California." The unemployed were not only former assembly-line workers—they were secretaries, accountants, and cashiers, among other professions, too.

The 1980 recession had bludgeoned the labor market, and the 1981-82 recession was about to bludgeon it even more. Manufacturing plants across the country were shutting down or relocating. IBM was about to introduce the personal computer, rendering skills such as shorthand all but useless. A crack epidemic was ravaging cities. The unemployment rate, which had been hovering around 7 percent from 1980 to 1981, was rising; it would be close to 11 percent by 1982. The federal government's main source of funding for job training was due to expire a year later.

From one vantage point, today's workers seem to be facing a vastly different landscape: The U.S. has somewhat recovered from a historic recession, unemployment is low at 4 percent, and the manufacturing sector is growing again, albeit slowly. But many aspects of today's climate echo those of the 1980s: The global economy is expanding, new technology is entering the workplace at every turn, and an opioid epidemic is ravaging communities. Automation will create new jobs, but analyses show that they'll require digital skills that only the most educated and experienced people will have. Researchers today are predicting that white- and blue-collar workers will be in predicaments similar to those of their counterparts in the 1980s and '90s, as soon as 2020, if they aren't already there.

As a response to these challenges, many policymakers and business leaders turn to a federal policy solution that has stuck around since before Reagan's day: job-training programs. These types of policies remain popular—at least in name, if not in investment—despite little evidence that they succeed on a large scale. In 2014, then-President Barack Obama signed the Workforce Innovation and Opportunity Act. And over the summer, Senator Robert Menendez and Representative Albio Sires, both New Jersey Democrats, introduced the Better Education and Skills Training for America's Workforce Act.

Reagan's version of this was the Job Training Partnership Act (JTPA) of 1982, which spent nearly $3 billion yearly between 1984 and 1998 and made some adjustments to the previous job-training formula by removing provisions for subsidized jobs in local and state governments and giving more control to the private sector. Although JTPA accounted for a relatively small share of the president's response to unemployment, it was arguably Reagan's most direct response to workers. And it worked about as poorly as any other program has.

Its flaw wasn't Reagan's approach. Rather, studies suggest it suffered from the challenges typical of federal job-training programs even before Reagan's time. Such programs have historically been unable to change an economy in which low-wage workers suffer from both low pay and a lack of autonomy. The initiatives have struggled to achieve their mission for several reasons. One, those who need the training typically don't know about—or are excluded from—them. Two, course material tends to be disconnected from the needs of employers. Three, and perhaps most important, job-training programs don't force employers to pay skilled people decent wages. It's worth taking a closer look at why programs like this go awry.

I spoke with experts who have studied job-training programs in the United States and abroad. They say Reagan's JTPA epitomizes the various job-training programs that have been spun up over the years in America; indeed, its track record echoes what has happened in training programs before and since. As Americans prepare for the next wave of innovation, history shows that new technologies have affected white- and blue-collar workers differently. In turn, although policymakers often talk about a single workforce, when recessions and automation strike, the fortunes of white- and blue-collar workers are likely to diverge, and training is largely an ineffective approach to the needs of the latter. History also shows that any successful response to automation's impact will have to recognize that training can never be enough to shepherd Americans into a new economy.

For the country's white-collar workers (those who worked in office and administrative settings), automation wasn't a job killer but rather a job changer. White-collar workers had to contend with the advent of personal computers and word-processing systems, but rarely did they lose their jobs because of the new technology. For them, the changes amounted to little more than a new chapter, as companies tended to offer some in-house training to help them keep abreast of the shifts.

In the middle of the 1990s, a decade during which office automation was rampant, employees in establishments with 50 or more workers received an average of 10.7 hours of formal training per year, according to the Bureau of Labor Statistics. Some companies took extra steps to ensure their employees weren't left behind. The Nashville, Arkansas, accounting firm Woods & Woods is one example. Its owners traveled from IBM seminar to seminar, learning how to use the new personal computer so that they could train their staff. Meanwhile, white-collar workers whose companies didn't offer training went back to school, learned new skills on their own, or waited for a new job that fit their needs. Workers could wait if they had employers in their social networks who could connect them to jobs or had healthy savings accounts on which they could depend.

For the country's blue-collar workers, on the other hand, automation was a job killer, as was globalization. Blue-collar workers employed in factory or construction-type settings found themselves grappling primarily with the latter phenomenon. Some companies partnered with unions, such as the United Automobile Workers and the Communications Workers of America, to train workers. But although unions were eligible for JTPA funding, a very small percentage of it went to these programs. And regardless of the funding source, training for blue-collar workers was not as effective as was training for their white-collar counterparts: These laborers weren't merely losing jobs because they didn't know how to use the latest technology—they lost jobs because people in places such as Japan and China could do the work cheaper and faster.

What's alarming today is that companies are less loyal to their workers than they were then. Companies in the 1970s, for example, reinvested the majority of their profits into research and development, retraining their workers, and raising salaries. But in recent years, the nearly 500 large companies in the S&P 500 index have spent half of their profits on buybacks and roughly 40 percent on dividends, rather than on workers. Many job vacancies are caused not by a lack of skilled workers, some labor economists argue, but rather by employers' inability to create good working conditions for workers. What's more, unions have lost influence—20 percent of employed workers were union members in the mid 1980s, compared to around 11 percent in 2014—and with that the power to force companies to protect workers. This reality makes it all the more important to get job training right.

Federal policy has consistently failed at training, focusing for decades on the worker and the question of what makes that worker well-trained; it's paid little attention to employers and the question of how they can change to better recruit and retain employees. But what if many of the 6 million jobs that are currently vacant are open not because people aren't qualified but rather because employers aren't hiring? What happens when a company is more interested in employing someone who's willing to work in dreadful conditions for less than a decent wage than in someone who has computer literacy? And what's the upshot in a society that lacks unions that would otherwise force better conditions? It's critical that policymakers and business leaders consider these questions as they help design future job-training and workforce-development initiatives.

* * *

After studying job-training programs for 20 years, Gordon Lafer learned that they don't work. Lafer, the author of The Job Training Charade, wasn't always disenchanted with federal job-training programs. Back in the mid '80s, when JTPA was first implemented, Lafer was working as an economic-policy analyst in the office of then-New York City Mayor Ed Koch. Filling job vacancies and addressing unemployment were among the office's top priorities. Lafer, like most of his colleagues, believed that job training was the best solution; the challenge, he assumed, was that programs needed more funding.

Lafer was convinced otherwise four years later. He recalled that Koch had been meeting with executives at sheet-metal firms, and one of the executives allegedly told the mayor something along the lines of: Even for 10 bucks an hour, you can't get people who want to work. The sheet-metal executive didn't say that he had open positions paying that wage, but Koch apparently took it that way, and in speeches around the city he told New Yorkers that $10-an-hour jobs were waiting for them. Letters from struggling single parents and the homeless soon filled the office's mailbox. And those letter-writers all asked one question: Where are those jobs?

It turned out that many of the open jobs actually paid below minimum wage, and Lafer drafted responses apologizing for the mayor's error. But the envelopes never made it out of the office. You can't possibly tell people that the mayor was wrong, his supervisor allegedly instructed him. Tell those people to ask their local job-training centers where those $10-an-hour jobs are instead. Lafer was dumbfounded.

Those job centers were funded through JTPA—and if the experience of the policy's beneficiaries elsewhere in the country is any indication, those job centers would've done little to help land those struggling New Yorkers decent-paying jobs.

Several studies have explored whether Lafer's experience was the norm. JTPA would be deemed successful by the federal government's standards if participants got jobs, earned better wages than they did previously, and were less likely to depend on welfare relative to their time before training. It's worth noting that a relatively low percentage of unemployed Americans participated, and most of those who did participate came from manufacturing. In the most extensive evaluation of a federal job-training program at the time, the Labor Department scrutinized enrollment info and employment outcomes from programs in 16 regions across the country representing around 20,000 Americans; it also collected additional data, such as food-stamp records, from some of the regions and consulted with vocational schools to assess the costs of their programs as a comparison, among other analyses. Another prominent study published in Cornell University's Industrial and Labor Relations Review looked at participation in JTPA in Tennessee, comparing enrollee information to data from a monthly survey of U.S. households conducted by the U.S. Census Bureau and Bureau of Labor Statistics. A working paper published by the National Bureau of Economic Research also analyzed participation at four sites.

The studies all found that JTPA was largely unsuccessful. It had a "modest positive impact" on disadvantaged adults' earnings, hovering around an increase of $500, according to the national JTPA analysis. The ILR Review study found that programs were "cream-skimming," or only accepting into their programs those who were bound to get a job; those who were most in need, it found, were driven away to ensure the programs had high success rates. The NBER paper found that people self-selected into the programs, and so those who would get jobs were likely the ones who were proactive in finding training.


Harry Holzer, a fellow at the Brookings Institution, emphasized that analyses have shown mixed results; JTPA's shortcomings, he argued, have been overstated. (Conducting more follow-up evaluations is one way to get a clearer picture of what exactly the programs achieve, Sires, the New Jersey representative who co-introduced a job-training bill this past summer, told me.) But other scholars disagree with Holzer, offering various reasons for JTPA's minimal success over the years: They contend that the programs are too divorced from employers' needs, too unrelated to workers' interests, too light-touch, and too limited in their reach, among other flaws. And these same flaws explain why earlier federal job-training programs failed, too.

So why have federal programs, with their focus on changing the worker rather than the job, remained the solution of choice? The answer is simple: It's much harder to force employers to change people's wages and work conditions—especially when those companies can offshore to foreign countries and employ foreign workers—than it is to convince them to place money and responsibility in the hands of government officials.

It's important to stress that many people affected by automation and globalization had much more positive experiences than those New Yorkers who could not find decent-paying jobs in the 1980s. Cecilia Mejia's trajectory is one example. Mejia participated in 1988 in a JTPA-funded youth program in Arizona; she is now the director of Santa Cruz County's public fiduciary office, and she credits her career climb to the digital skills, such as making spreadsheets in Excel, that she learned in the program.

Citing examples like Mejia and the increased availability of intricate, real-time data on the needs of local economies, some policymakers are convinced that effective federal job-training is possible.

Still, as Lafer, who now works at the left-leaning Economic Policy Institute, stressed, anecdotes are not enough to inform policy. After 20 years of studies, he believes there is enough data to show that the programs (though perhaps grounded in a genuine interest in expanding social supports) cannot work on a large scale. According to Lafer, that's because they're rooted in a superficial understanding of unemployment—in a misguided conviction that people don't have jobs simply because they don't have the skills. These programs won't work, he emphasized, until they acknowledge the possibility that job vacancies are also, perhaps primarily, caused by employers that are disloyal and inadvertently create poor working conditions for their workers. Until then, the programs will remain purely symbolic.

* * *

The story of Roberta Gainer, who was both a worker and a job trainer, shows how ineffective these programs can be. Six days a week, Gainer commuted every morning from Niagara Falls to Buffalo, New York, to work. Starting at dawn, she'd spend nine hours daily putting rocker arms on engines, a difficult job she'd had since graduating from high school. Modern-day assembly lines will wait for a laborer to put the pieces on, Gainer told me, but back then, workers had to run down the line chasing the engine—or worse, stop the line, which affected everyone on the floor. Automation and globalization didn't spare her—even though she'd been trained to become a trainer for her fellow workers through a fund sponsored by her union and Ford, Chrysler, and General Motors. Gainer got laid off after two years on the job, but according to her, it was not because she was unable to complete her tasks.

The training she received was intensive. Academics, mainly from Cornell's labor-relations school and Erie Community College, stood at the front of classrooms alongside factory workers like Gainer, teaching laborers about the economy, the history of the labor movement, and how to use and repair computers. Workers also got help with resume writing and the job search. Gainer learned so much that she was able to train her colleagues for jobs in construction and entry-level positions in the medical and mechanical fields. "I was always nervous to teach, before every class," Gainer said, but she managed the pressure by reminding herself that no one knew or cared if she was making a mistake. Indeed, the training was as impractical as it was intensive: Workers weren't interested in classroom training—they wanted education that was more hands-on. Given that even Gainer's qualifications ultimately didn't guarantee her job security, perhaps those workers were onto something.

Marc Perrone's experience further supports the conclusion that job-training programs are ineffective. His experience as a blue-collar worker was so tumultuous that it inspired him to become a leader in the labor movement. In blue-collar work, the proliferation of technology means that virtually no skill is evergreen—even newly adopted skills become less valuable over time. Tasks that once took hours now take minutes; machines become easier to understand. Perrone was a victim of this reality and, like Gainer, ultimately couldn't retain a full-time job despite being well trained.

Perrone's first job, as a butcher's assistant, required him to work with large slabs of hanging beef in the meat departments of three different grocery stores in Pine Bluff, Arkansas. He learned how to cut the meat, package it, and price it for sale. He was eventually moved over to the retail side (those under 18 weren't supposed to operate the sharp machinery in the meat department), where he was responsible for stocking the shelves and ringing up grocery items. The machines didn't do mathematical calculations at the time, so he had to add up prices in his head; UPC barcodes didn't exist, so he had to memorize the cost of each item. Companies like Kroger, at whose grocery stores he worked as a stocker and cashier, had substantial training programs to help employees adapt to new technology such as computerized cash registers. He took advantage of those opportunities, and within two years he had advanced to journeyman, the ranking just below master.

But according to Perrone, no amount of training could fully guarantee job security. It wasn't as if the new technology required that many new skills; in fact, it often "deskilled" workers, requiring less knowledge than employees had needed before digitization took over.

In the '80s, the U.S. was beginning to move from an economy in which many employers had strong, long-term relationships with their employees to one in which companies and workers often had shorter-term relationships. In that new universe, technology often and quickly rendered skills obsolete. Training helped workers up to a certain point, and then they were on their own—especially if their employer was more interested in productivity and shareholder value than in the workers themselves. Perrone's belief that workers needed more benefits and supports led him to the helm of the United Food and Commercial Workers Union, where he still serves as president.

Perrone and Gainer ended up doing okay for themselves. After being laid off from her rocker-arms job, Gainer, who was displaced from the auto industry for a total of nearly five years, worked at another manufacturing plant for six months making sandpaper. She subsequently traded in her assembly-line jumpsuit for business-casual attire, landing a job as a customer-service representative at Goldome, a mortgage-bank company, where she helped people try to save their homes. General Motors eventually called her back, as was common in the auto industry after dry spells, and she returned without hesitation. Perrone, on the other hand, eventually left his job in the grocery stores because he could not put together a full-time schedule, and he reentered the formal education system in his 20s. Still, it wasn't their training that immediately protected them from broader economic phenomena such as automation and globalization. It was the seesawing tendency of the economy; it was patience; it was luck.

* * *

White-collar workers were luckier. Compared to their blue-collar counterparts, they were more buffered from the impact of the economic trends. Take the story of Nan DeMars. As a young child in Minnesota in the 1940s, DeMars spent hours in her home's attic, which her architect father had fashioned into a small office. There was a desk and a chair. Pencils and papers rested on the desk, flanking the black Smith Corona typewriter that belonged to DeMars' mother. This is where she practiced to become a secretary.

She took two years of shorthand courses in high school, after which she went on to get her bachelor's degree in business from the University of Minnesota. She then started working at Ringer Corporation, a lawn and garden company. Walking into the office one morning, DeMars was greeted by a huge opened box. As she edged closer to her desk, she peeked inside. A word processor, with a pink ribbon tied around it, looked back up at her. "I immediately gave it a girl's name, Elizabeth, and sat down to see what in the world it would do," DeMars told me. It was her responsibility to learn the new technology, which was not unusual for workers facing office automation during this time. She managed, in part, because of the studying skills she developed in college. Having successfully adapted to the automation, she kept her job.

Other secretaries weren't as fortunate. While president of the National Secretaries Association (now known as the International Association of Administrative Professionals), DeMars noticed that those who were unwilling to engage with the new computers, perhaps out of fear, did not keep up with technology and were demoted to lower-paying jobs such as file clerks. Indeed, Beverly Peterson, a secretary who worked at General Mills at the time, noted that an open mind was essential to job security in the field. Peterson's boss encouraged her to get a degree to protect herself from the ramifications of a changing workplace. He emphasized that "credentials really mean something in this world," said Peterson, and she "really took it to heart." Both DeMars and Peterson were able to hold onto their jobs, in part because their college educations prepared them to learn and keep up with new technologies, but also because they'd developed bonds with their devoted employers. Those who did not attend college, or who were marginalized in other ways, namely women of color, suffered.

These lessons about blue- and white-collar workers still apply. Amid a rapidly growing and evolving economy, the country faces a similar divergence between the fortunes of the two kinds of workers. Short-term job opportunities will become even shorter-term as newly gained skills deteriorate in value more quickly than they did in the '80s. The methods for obtaining new skills are more expensive than they were then. White-collar workers may be able to adapt, but the blue-collar folks trapped in a job landscape where all skills have an expiration date will be hit hard.

Training is undeniably crucial: When the economy settles down from the turmoil to come, workers will need it to be ready to reenter the workforce. The problem is that federal programs have, since the 1930s, consistently been the default choice for alleviating the burdens of innovation. As soon as one federal job-training program expires, another replaces it—despite history showing that large-scale efforts are all but impossible to pull off.

36 TV Shows to Watch in 2018

Posted: 04 Jan 2018 04:00 AM PST

If 2017 welcomed a fleet of film mainstays to the small screen, 2018 is further bolstering television's status as the medium of choice for actors and directors. The next three months feature upcoming series created or produced by Steven Soderbergh, Cary Fukunaga, Alex Gibney, and Ridley Scott, along with shows featuring up-and-coming talent such as Lena Waithe, Yara Shahidi, and Maya Thurman-Hawke. The rest of the year promises stars (Benedict Cumberbatch, Penélope Cruz, John Krasinski), luminaries (Anna Deavere Smith, David Attenborough), and revivals (The X-Files, Roseanne).


Fox

9-1-1 (January 3, Fox)

Ryan Murphy and Brad Falchuk (American Horror Story) turn to the humble procedural for their newest series, a fast-paced drama about first responders in Los Angeles. Angela Bassett, Connie Britton, and Peter Krause star.

Freeform

grown-ish (January 3, Freeform)

Freeform's spinoff of ABC's black-ish, also created by Kenya Barris, follows Zoey (Yara Shahidi) as she heads to the fictional Southern California University. The cast includes black-ish's Deon Cole, as well as the YouTube stars and Beyoncé protégés Chloe and Halle Bailey.

Fox

The X-Files (January 3, Fox)

The 11th season of Chris Carter's paranormal thriller will be Gillian Anderson's last as Dana Scully, but fans at least have 10 episodes to console themselves with. If the 2016 revival was anything to go by, expect tinfoil-hat conspiracy theories bookending a handful of standout stories.

Netflix

The End of the F***ing World (January 5, Netflix)

The first semi-surprise drop of the year is this import from Britain's Channel 4, a pitch-black dramedy about two teens running away from their unhappy homes. One is a misanthrope (Penny Dreadful's Jessica Barden); the other (Black Mirror's Alex Lawther) is a psychopath whose ambitions include murder.

Netflix

Rotten (January 5, Netflix)

Netflix's new six-part documentary series tackles the weighty subject of corruption, waste, and abuse in the food-production industry, from the unpalatable truth about chicken farming to the criminal exploitation of fishing in the Northeast.

Showtime

The Chi (January 7, Showtime)

Lena Waithe, an Emmy winner for her writing on the Netflix series Master of None, is the creator of this drama set on Chicago's South Side. Executive produced by Common, the show delves into the human stories of a region that's frequently reduced to crime statistics.

HBO

David Bowie: The Last Five Years (January 8, HBO)

Francis Whately chronicled the seminal years of David Bowie's career in his film David Bowie: Five Years. This follow-up documentary considers another period of creative fruition for Bowie—the half-decade leading up to his death from liver cancer in 2016.

Freeform

Alone Together (January 10, Freeform)

The comedians and actors Esther Povitsky and Benji Aflalo star in this single-camera show about two millennials in Los Angeles in a committed but platonic relationship. The series is executive produced by The Lonely Island's Andy Samberg, Akiva Schaffer, and Jorma Taccone.

Amazon

Philip K. Dick's Electric Dreams (January 12, Amazon)

This Black Mirror–esque anthology series based on Philip K. Dick's short stories premiered on Britain's Channel 4 last year, and features actors including Bryan Cranston, Anna Paquin, Janelle Monae, Steve Buscemi, and Terrence Howard in an array of strange, futuristic fables.

The CW

Black Lightning (January 16, The CW)

Another DC Comics character gets his own TV show in Black Lightning, a series created for The CW by Mara Brock Akil and Salim Akil (Being Mary Jane). Cress Williams stars as a high-school principal who's also capable of generating and harnessing electricity.

FX

American Crime Story: The Assassination of Gianni Versace (January 17, FX)

The follow-up to Ryan Murphy's critically acclaimed autopsy of the O.J. Simpson trial explores the 1997 murder of the fashion designer Gianni Versace by the serial killer Andrew Cunanan. Édgar Ramírez plays Versace, with Penélope Cruz as his sister, Donatella. Ricky Martin and Darren Criss co-star.

Comedy Central

Corporate (January 17, Comedy Central)

Pat Bishop, Matt Ingebretson, and Jake Weisman co-created this dark satire about frustrated cogs (a.k.a. junior executives) in a corporate machine. Lance Reddick (The Wire) plays a merciless CEO, with Aparna Nancherla as the overworked head of human resources.

BBC America

Blue Planet II (January 20, BBC America)

David Attenborough's follow-up to the groundbreaking 2001 documentary series The Blue Planet was voted the best series of 2017 by British viewers. Its stunning eight-episode investigation of marine life features a score by Hans Zimmer, who collaborates on one track with Radiohead.

Starz

Counterpart (January 21, Starz)

J.K. Simmons plays not one but two characters in Starz's new drama, a spy thriller about a U.N. operative in Berlin who discovers his agency is protecting a portal to a parallel dimension, and his alternate self is his only ally. Olivia Williams and Nazanin Boniadi co-star.

TNT

The Alienist (January 22, TNT)

Daniel Brühl, Luke Evans, and Dakota Fanning star in the new series from Cary Fukunaga (True Detective), an adaptation of the novel by Caleb Carr about a psychiatrist investigating a serial killer in 19th-century New York.

HBO

Mosaic (January 22, HBO)

Before Steven Soderbergh's new murder mystery was a six-part series for HBO, it was an interactive, choose-your-own-adventure-style app in which viewers could elect which character's perspective to inhabit. Sharon Stone and Garrett Hedlund star in both versions.

Paramount

Waco (January 24, Paramount)

Anticipating the 25th anniversary of the Waco siege is Paramount's dramatic miniseries based on the 1993 battle between the FBI and the Branch Davidians. Taylor Kitsch plays David Koresh, with Michael Shannon, Melissa Benoist, Andrea Riseborough, and John Leguizamo rounding out the cast.

Netflix

Dirty Money (January 26, Netflix)

Alex Gibney (Going Clear, Enron: The Smartest Guys in the Room) helms this six-part documentary series investigating corporate malfeasance. Each episode boasts a different director, with Gibney on Volkswagen, Erin Lee Carr on Big Pharma, and Fisher Stevens on Trump Inc.

Netflix

One Day at a Time (January 26, Netflix)

Gloria Calderon Kellett's acclaimed update of the Norman Lear sitcom returns for a second season on Netflix, with Justina Machado, Rita Moreno, Isabella Gómez, and Marcel Ruiz reprising their roles as a tight-knit Cuban American family.

Netflix

Altered Carbon (February 2, Netflix)

The most-anticipated new Netflix series of early 2018 is probably this adaptation of the 2002 cyberpunk novel by Richard K. Morgan, with shades of Get Out and Black Mirror. Joel Kinnaman (The Killing, House of Cards) stars as a soldier in the future whose soul is implanted in a new body to help solve a murder.

Showtime

The Trade (February 2, Showtime)

Matthew Heineman (City of Ghosts, Cartel Land) directs Showtime's new five-part documentary series about the opioid epidemic, exploring its sprawling impact from Mexico to the U.S., and the growers, addicts, cartels, and law-enforcement officers who face it firsthand.

HBO

Notes From the Field (February 24, HBO)

One of the leading voices in documentary theater, Anna Deavere Smith, brings her one-woman play of the same name to HBO. The show investigates and dramatizes the playground-to-prison pipeline, with Smith interviewing more than 200 students, teachers, parents, and administrators, and recreating their stories.

NBC

Good Girls (February 26, NBC)

Christina Hendricks (Mad Men), Retta (Parks and Recreation), and Mae Whitman (Parenthood) star in NBC's new Monday-night sitcom about three suburban housewives with money troubles who decide to rob a supermarket.

USA

Unsolved (February 27, USA)

Anthony Hemingway, whose resume includes American Crime Story: The People v. O.J., executive produces USA's new anthology crime series, dramatizing the investigation into the murders of Tupac Shakur and Biggie Smalls. The cast includes Josh Duhamel and Jimmi Simpson, with Marcc Rose as Shakur and Wavyy Jonez as Smalls.

Hulu

The Looming Tower (February 28, Hulu)

Based on the Pulitzer Prize–winning book by the New Yorker staff writer Lawrence Wright, Hulu's new drama depicts how American agencies approached the rising threat of Osama bin Laden in the 1990s, amid a backdrop of intra-agency feuding. Jeff Daniels, Peter Sarsgaard, and Tahar Rahim star.

Hulu

Hard Sun (March 7, Hulu)

Neil Cross (Luther) writes and directs this "pre-apocalyptic" police drama, a Hulu/BBC co-production set in London about two officers (Agyness Deyn and Jim Sturgess) who uncover evidence that the end of the world is nigh.

Netflix

Marvel's Jessica Jones (March 8, Netflix)

The long-awaited second season of Melissa Rosenberg's series follows an all-too-fleeting Jessica Jones appearance in the lackluster The Defenders. This time around, Jessica (Krysten Ritter) seems compelled to investigate the origins of her powers, with David Tennant making at least a brief appearance as Kilgrave.

NBC

Rise (March 13, NBC)

Jason Katims (Friday Night Lights) and Jeffrey Seller (Hamilton) are the producers behind this musical adaptation of Michael Sokolove's 2013 nonfiction book Drama High. Josh Radnor (How I Met Your Mother) plays the music teacher invigorating a high-school theater department; Rosie Perez co-stars.

AMC

The Terror (March 26, AMC)

Executive produced by Ridley Scott, this 10-episode series is based on Dan Simmons's best-selling 2007 novel about a 19th-century naval mission in the Arctic, complete with starvation, mutiny, and a monster. Jared Harris, Tobias Menzies, and Ciarán Hinds star.

ABC / Instagram

Roseanne (March 27, ABC)

"Same couch, same cast, same laugh," is the tagline for ABC's revival of Roseanne, more than two decades after the series wrapped in May 1997. All of the primary cast members return, including both Beckys (Lecy Goranson and Sarah Chalke), thanks to some creative liberties.


Later in the Year:

Starz gets the American rights to the BBC's lavish adaptation of E.M. Forster's Howards End (April, Starz), starring Hayley Atwell and Julia Ormond. Another BBC import is Rellik (April, Cinemax), a police procedural that moves backward in time. In May, yet another highly anticipated BBC drama is the three-part adaptation of Louisa May Alcott's beloved Little Women (May 13, PBS), starring Emily Watson as Marmee and Maya Thurman-Hawke as Jo (Angela Lansbury also appears as Aunt March).

Coming later in the year is Melrose (date TBD, Showtime), a Sky Atlantic adaptation of Edward St. Aubyn's five Patrick Melrose novels, starring Benedict Cumberbatch. James Norton stars in McMafia (date TBD, AMC), a three-part miniseries with Russian and Israeli spy intrigue. And Tom Clancy's Jack Ryan (date TBD, Amazon) brings the super spy to the small screen in an adaptation starring John Krasinski.

There's No Way Congress Is Going to Fix Entitlements

Posted: 04 Jan 2018 03:00 AM PST

House Speaker Paul Ryan may think otherwise, but the tax bill he helped drive through Congress last month has likely ruled out any serious effort to address the growing costs of federal entitlement programs for the elderly.

That's a problem, and not just for Republicans like Ryan looking to shrink the federal government. It's an issue for Democrats, too: They want to preserve crucial investments in younger generations, but to do so they'll eventually—and begrudgingly—need to impose some limits on the rising spending for seniors.

The clear message of recent political history is that the only way to implement such constraints is for both parties to link arms behind them. Yet the Republican tax cut, by enlarging the federal deficit by up to $2 trillion on a party-line vote, has made such a bipartisan agreement almost impossible to construct.

"The tax bill has made it more difficult both on the substantive and political side," said Robert Bixby, the longtime executive director of the Concord Coalition, a nonpartisan group focused on deficit reduction. "In taking a one-sided approach, the Republicans were able to get what they wanted in terms of a tax cut, but they made it much more difficult to get any entitlement reform."

The most immediate loser in that equation is Ryan. Retrenching federal entitlements has been his north star since he arrived in Congress in 1999. Throughout Ryan's career, the policy idea most often associated with him is converting Medicare into a "premium support" or voucher system. Under that approach, the federal government would no longer pay directly for seniors' health care, as it does now, and instead would provide them a fixed sum of money to purchase private insurance. (While premium support would still allow seniors to buy into the existing Medicare system itself, most analysts believe it would quickly grow unaffordable because only those with the greatest health needs would do so.)

The "Better Way" policy blueprint House Republicans issued under Ryan's direction in 2016 endorsed premium support. But the speaker has not pushed it to a vote. In interviews last month, Ryan signaled his determination to force a 2018 debate on restructuring Medicare and other big safety-net programs. "We [must] spend more time on the health-care entitlements, because that's really where the problem lies," Ryan insisted.

Ryan is right that federal spending for the elderly is on an unsustainable trajectory, mostly because American society is steadily aging. The Social Security trustees estimate that the number of seniors will rise from 48 million to 86 million by 2050. The Congressional Budget Office projects that, as a result, by 2047 Social Security and the major federal health-care programs, principally Medicare and Medicaid, will consume two-thirds of all federal spending (excluding interest on the national debt). That's up from 54 percent now.

Conservatives have decried that future most vociferously, but it should concern Democrats, too. Increased spending on the elderly is already squeezing the resources available for investment in the productivity of future generations, such as education, scientific research, and infrastructure. The CBO projects that as spending grows on seniors, as well as on health care, federal discretionary spending—the portions of the budget that invest in future generations—will shrink relative to the economy.

That would not only be unfair to younger Americans, but it would also be counterproductive for the old: Seniors need more of today's diverse youth population to reach the middle class so they can pay the payroll taxes that support Social Security and Medicare. And although some Democrats want to avoid any cuts to programs for the elderly while at the same time preserving discretionary spending, in a graying society that would require an increase in taxes and total federal spending to a level that Americans are highly unlikely to accept.

This dilemma's logical solution is a three-sided agreement to raise taxes, impose some constraints on retirement programs, and preserve investments in future generations. Almost all serious, bipartisan deficit-reduction proposals (such as the one from the Obama-era Simpson-Bowles commission) have offered some variation on that formula. All have recognized that the only way to convince Democrats to accept entitlement cuts is to couple them with tax increases, and vice versa for Republicans.

But on both substantive and political grounds, the GOP tax bill has obliterated that possibility. By increasing the debt so much, the bill may force a future Congress to raise taxes just to fill the huge, new hole—even before it can respond to the increased spending demands of a growing elderly population. And the decision to cut taxes undermines the political formula required to induce almost any Democrat to consider supporting changes in entitlement programs: spending reductions balanced with tax hikes. "You have to have cooperation and trust across the aisle," Bixby said. "That's why you need a mix of spending cuts and revenue increases."

Ryan is now bugling for House Republicans to charge the hill for entitlement cuts without Democratic cover. But with Senate Majority Leader Mitch McConnell already publicly skeptical, Bixby and others think it's unlikely Ryan can spur a serious party-line drive to restructure Medicare, much less Social Security, the two biggest expenditures.

It's more likely any GOP drive to control "entitlements" will focus on programs that largely benefit lower-income families, including Medicaid and food assistance. But that effort will confront the same challenge as the attempt to repeal the Affordable Care Act: Particularly in the Rust Belt states that decided the 2016 election, the principal beneficiaries of those programs are the blue-collar whites at the core of the Republican coalition. By acting alone on taxes, Republicans have ensured that they must act alone on entitlements, too. And that means any meaningful fiscal and generational rebalancing of the federal budget will have to wait—likely for many years to come.

The Death of 'Trumpism Without Trump'

Posted: 03 Jan 2018 07:28 PM PST

President Trump's savage excommunication of his former chief strategist Steve Bannon Wednesday has left the movement that carried him to power at a crossroads.

Throughout his time at the helm of Trump's campaign and inside the White House, Bannon had cultivated an image as the ideological leader of the Trump base—a reputation he retained even after leaving the White House and resuming his role as chairman of Breitbart News. He had continued to talk to the president from time to time, and traveled the country making speeches promoting his agenda.

But comments he made to the journalist Michael Wolff in a new book coming out this month—calling Donald Trump Jr.'s Trump Tower meeting with a Russian lawyer "treasonous" and speculating about the Mueller investigation's targets—enraged Trump. After excerpts from the book began circulating online, the White House released a statement from Trump saying Bannon had "lost his mind," and accusing him of leaking to the press throughout his time in the administration.

From the moment Bannon left the White House last year, his stated mission was clear: expanding the coalition that elected Trump into a lasting ideological movement that would remake American politics. Their split tests the limits of both men's influence within their shared base. It also calls into question whether Trump-style nationalism has a future, or whether it starts and ends with Donald Trump.

Hoisting high the banner of "Trumpism without Trump," Bannon pledged a "season of war" against the Republican establishment, and set about recruiting populist outsiders to challenge GOP incumbents in the 2018 primaries. By flooding Capitol Hill with Trumpian torchbearers, Bannon believed, he would empower the president to make good on campaign promises like building a border wall, while also changing the DNA of the Republican caucus. That he would make life more difficult for Senate Majority Leader Mitch McConnell was an added benefit.

Meanwhile, Bannon has begun to work on creating a political infrastructure for his ideological movement. In November, he announced the formation of a new group that would promote a nationalist approach to trade, immigration, and foreign policy. (The group has not yet formally launched.) He used Breitbart News to amplify allies, attack adversaries, and shape new political narratives, all while coaching a new generation of conservative journalists in his combative style. He even showed a willingness to reach out to center-left writers and intellectuals—such as The American Prospect's Robert Kuttner, in the most notorious previous example of Bannon getting himself in trouble in an interview—perhaps believing that he could seduce them into joining the post-partisan populism he envisioned.

While many doubted Bannon's sincerity—detractors have long dismissed him as a cynical opportunist—his project was not without precedent. When, half a century ago, the conservative movement began taking control of the Republican Party, it was largely thanks to a network of think tanks, pressure groups, magazines, and commentators who had spent years articulating and popularizing their ideas. Bannon seemed to recognize that without a similar foundation, the unorthodox brand of populist nationalism that Trump campaigned on would struggle to achieve its goals, and be uprooted from Republican politics the moment the president left office.

But with his kneecapping of Bannon Wednesday, the president made clear that Trump—and not Trumpism—was still the main event, and that any effort to take the spotlight away from the Oval Office would be met with a swift and severe punishment. Amid the fallout of this high-profile political divorce, the most urgent question may be who will retain control of Trump's core base of supporters.

Ben Shapiro, a former Breitbart News editor-at-large who left the site in 2016 over his disagreements with management, said Bannon's efforts to "graft a nationalist populist philosophy onto Trump" were always doomed to fail.

"Bannon was delusional," he said. "There is no Trumpist movement, there is just Trump."

Shapiro pointed to Breitbart's own comments section—where hardcore fans were re-pledging allegiance to Trump on Wednesday afternoon and voicing their displeasure with Bannon—and said the president had little to fear from a potential backlash. "The White House has given conservatives more good policy in the past six weeks than they ever did when Bannon was there," Shapiro said, citing the passage of the tax reform bill, the ongoing appointment of appellate court judges, and the declaration of Jerusalem as Israel's capital. And while Shapiro has been an outspoken critic of Trump over the past two years, he said he was thrilled by the president's statement disowning Bannon.

"It was spot-on in every conceivable way," he said. "It is the truest thing Trump has ever said about any issue in any statement, ever."

Others, however, continue to believe that Trump's message has always been bigger than Trump himself. Ann Coulter, the right-wing commentator who is as much a leader among Trump's core supporters as anyone else, said the "cat fight" between the president and Bannon will have little effect on his most ardent supporters.

"Unlike so-called conservative intellectuals, Trump's base loved him BECAUSE OF HIS IDEAS," Coulter wrote in an email. "Specifically: 1) immigration; 2) trade; and 3) no more pointless wars." She continued, "Except for a few idiots, it's never been about Trump's personality or celebrity. It certainly isn't about Bannon's personality or celebrity. So there's no reason for the base to take sides here."

Similarly, Sam Nunberg, a former Trump adviser and a Bannon ally, said the base's support for the president would continue to be "issues-centric."

"If you had to ask them [who they'll side with], they're going to side with the president of the United States," he said. "But does that mean the president can pass DACA [without consequence]? Does that mean the president can become Mitch McConnell's water boy? The answer's no."

Bannon and his allies were mostly silent for the first half of Wednesday; he declined a request for comment. Breitbart reported the news of Trump's statement straight, but did offer something of a defense of Bannon by noting that his foes, the "Never Trump" conservatives, had rejoiced at his falling-out with Trump. One Bannon ally, speaking on condition of anonymity in order to talk frankly about internal discussions, argued that the White House's reaction to Bannon's view of the Russia matter was suspiciously vehement. "It does seem like the White House is protesting too much," this person said. As for the base, "all the president needs to worry about in terms of losing his base is DACA. He doesn't need to worry about comments in books, throwaway lines or whatever."

One person who spoke with Bannon on Wednesday, who requested anonymity in order to talk about a private conversation, said that Bannon seemed "blasé" about the matter and "didn't seem to care at all," and did not deny making the comments.

The situation puts pressure on Bannon's chosen candidates to lead his "season of war" in the midterms. The McConnell-aligned Senate Leadership Fund has already begun using Bannon's remarks against Trump as a cudgel against the candidates Bannon endorsed, as have some of their opponents.

So far, none of those candidates are exactly leaping to Bannon's defense. The campaign of Kelli Ward, who is running for Senate in Arizona and for whom Bannon has personally campaigned, released a lukewarm statement.

"Steve Bannon is only one of many high-profile endorsements Dr. Ward has received," Ward's spokesman Zachery Henry said, referring to the feud as the "daily parlor intrigue in Washington D.C."

Another Bannon pick, West Virginia Senate candidate Patrick Morrisey, went further on Wednesday.

"Patrick Morrisey has been endorsed by many conservatives throughout West Virginia and America because of his strong conservative record," said Morrisey spokesperson Nachama Soloveichik. "Attorney General Morrisey does not support these attacks on President Trump and his family, and was proud to stand with President Trump in 2016 when they were both overwhelmingly elected in West Virginia and when he cast his vote for Trump in the Electoral College."

The flap comes at a difficult time for Bannon, who nearly alone among top Republicans stood by the side of Roy Moore amid a deluge of sexual-misconduct allegations and was rewarded for his efforts with Moore's embarrassing loss to the Democratic candidate, Doug Jones, in deep-red Alabama. Though Bannon entered that race on the opposite side of Trump, who endorsed Luther Strange in the primary, the two both ended up supporting Moore.

The loss appears to have stuck in Trump's craw. He mentioned it in his statement slamming Bannon: "Steve had very little to do with our historic victory, which was delivered by the forgotten men and women of this country. Yet Steve had everything to do with the loss of a Senate seat in Alabama held for more than thirty years by Republicans."

"Steve doesn't represent my base—he's only in it for himself," Trump said. His falling-out with his former chief strategist will test the extent to which that is true.

Trump Disbands His 'Voter Fraud' Commission

Posted: 03 Jan 2018 06:16 PM PST

President Trump announced he was disbanding his Presidential Advisory Commission on Election Integrity Wednesday, citing a desire not to engage in "endless legal battles at taxpayer expense."

The genesis of the commission was the president's baseless claim that "millions" of illegal votes had cost him the popular-vote victory in the 2016 election, when his Democratic rival Hillary Clinton won about three million more votes. Neither the commission nor the president ever turned up evidence supporting Trump's claim, but the panel became a source of controversy as civil-liberties and voting-rights groups accused the body of violating privacy protections and transparency obligations. The panel faced a number of lawsuits from groups like the American Civil Liberties Union, the Brennan Center, the Electronic Privacy Information Center, and even a Democratic member of the commission itself.

As my colleague Vann Newkirk II wrote in September, the commission was "dogged by allegations that its true purpose is not eliminating voter fraud, but instigating voter suppression." Critics of the commission pointed to the fact that its appointees included longtime advocates of restrictive voting laws, such as Kansas Secretary of State Kris Kobach, to argue that the panel merely existed to provide pretext for further voting restrictions.

Civil-rights groups celebrated the end of the commission. Dale Ho of the ACLU voting rights project called the panel "a sham from the start." Vanita Gupta, formerly the head of the Obama-era civil-rights division and head of the Leadership Conference on Civil and Human Rights, said the commission was "a political ploy to provide cover for the president's wild and unfounded claims of mass voter fraud, and to lay the foundation to purge eligible voters from the rolls." Kristen Clarke, head of the Lawyer's Committee for Civil Rights, one of the groups that sued the commission, said the panel was "launched with the singular purpose of laying the groundwork to promote voter suppression policies on a national scale," adding that its disbanding was "a victory for those who are concerned about ensuring access to the ballot box."

Rick Hasen, an election-law expert and law professor at the University of California, Irvine, wrote that the commission's demise was due in part to "the light of publicity and the intense scrutiny the press and many of us gave the Commission. If they were going to provide a predicate for voter suppression, they were not going to be able to do it without a fight."

There are lingering questions even now that the commission is disbanded. For one, while many states resisted sending their voters' personal information to the commission, others did, raising the question of what happens to the data that was collected.

"The commission may no longer exist, but what happened to the information?" asked Sherrilyn Ifill, head of the NAACP Legal Defense Fund, one of the groups that sued the commission. "That information should be removed or erased, it should not exist in the servers of commission members and officials in the White House."

Trump's statement also says that he "asked the Department of Homeland Security to review these issues and determine next courses of action."

"We will be making some efforts to learn more what directive the Department of Homeland Security has been given with regard to investigating voter fraud," Ifill said, "and what methods they will use to do whatever work they have been asked to do by the president."

<em>The Atlantic</em> Daily: ‘It’s Crazy, but There You Are’

Posted: 03 Jan 2018 03:31 PM PST

What We're Following

The 'Nuclear Button': President Trump responded to Kim Jong Un's boast of North Korea's nuclear program with a tweet that declared, "I too have a Nuclear Button, but it is a much bigger & more powerful one than his." While America's nuclear launch codes are actually contained in a briefcase, Trump's tweet summed up American strategists' long-held theory of nuclear deterrence. Yet Twitter—a platform that encourages glib and impulsive comments—adds an extra element of volatility to the delicate balance that deterrence requires. The president's comments could lead South Korea to decide that China is a more reliable ally. They could even, writes Eliot A. Cohen, be a sign of impending war.

Bannon Back-and-Forth: A new book features quotes from Trump's former adviser Steve Bannon that describe the Trump campaign's July 2016 meeting with Natalia Veselnitskaya, a Russian lawyer, as "treasonous" and "unpatriotic." Trump responded with a harsh repudiation, claiming that Bannon "lost his mind" when he was fired. The high turnover in Trump's West Wing is notorious, but it's not uncommon for presidents to replace former campaign aides among their staff with more-experienced officials. Trump's just doing it a lot faster than others have.

Understanding Trump: How to explain the president's unpredictable behavior? On the Atlantic Interview podcast, the journalist Maggie Haberman describes how Trump's experiences in New York City may have shaped his personality. And James Hamblin argues that it's time for a formal, independent evaluation of Trump's neurological health. Here's why.

Rosa Inocencio Smith


Snapshot

Firefighters work among power lines that have icicles hanging from them in the Bronx, New York, on January 2, 2018. See more icy winter scenes here. (Shannon Stapleton / Reuters)

Evening Read

Veronique Greenwood wonders: Why do living things sleep?

Ask researchers this question, and listen as, like clockwork, a sense of awe and frustration creeps into their voices. In a way, it's startling how universal sleep is: In the midst of the hurried scramble for survival, across eons of bloodshed and death and flight, uncountable millions of living things have laid themselves down for a nice long bout of unconsciousness. This hardly seems conducive to living to fight another day. "It's crazy, but there you are," says Tarja Porkka-Heiskanen of the University of Helsinki, a leading sleep biologist. That such a risky habit is so common, and so persistent, suggests that whatever is happening is of the utmost importance. Whatever sleep gives to the sleeper is worth tempting death over and over again, for a lifetime.

Keep reading here, as Greenwood joins an international team of scientists trying to solve the mystery of sleep.


What Do You Know … About Science, Technology, and Health?

Donald Trump's latest incendiary tweet about nuclear war with North Korea seems like a clear threat of violence. But when Twitter users reported it, the company responded that "no violation" had occurred, which suggests that Trump is exempt from new Twitter policies banning such threats. In another tweet, from last week, Trump pointed to record-breaking low temperatures as proof that global warming is not a threat to the planet, suggesting that he might not even be interested in understanding climate change.

Can you remember the other key facts from this week's science, tech, and health coverage? Test your knowledge below:

1. ____________ made the earliest literary reference to a telephone in an 1878 short story for The Atlantic.

Scroll down for the answer, or find it here.

2. In November, a woman gave birth to a baby who had been frozen as an embryo for ____________ years.

Scroll down for the answer, or find it here.

3. The first gene therapy to treat inherited ____________ could cost people as much as $1 million.

Scroll down for the answer, or find it here.

Rachel Gutman

Answers: Mark Twain / 24 / blindness


Look Back

In honor of The Atlantic's 160th anniversary, we're sharing one article every day to mark each year of the magazine's history. From 1897, W. E. B. Du Bois describes the internal conflict of his African American identity:

It is a peculiar sensation, this double-consciousness, this sense of always looking at one's self through the eyes of others, of measuring one's soul by the tape of a world that looks on in amused contempt and pity. One feels his two-ness,—an American, a Negro; two souls, two thoughts, two unreconciled strivings; two warring ideals in one dark body, whose dogged strength alone keeps it from being torn asunder. The history of the American Negro is the history of this strife,—this longing to attain self-conscious manhood, to merge his double self into a better and truer self. In this merging he wishes neither of the older selves to be lost. He does not wish to Africanize America, for America has too much to teach the world and Africa; he does not wish to bleach his Negro blood in a flood of white Americanism, for he believes—foolishly, perhaps, but fervently—that Negro blood has yet a message for the world. He simply wishes to make it possible for a man to be both a Negro and an American without being cursed and spit upon by his fellows, without losing the opportunity of self-development.

Read more here, and see more from our archives here.


Reader Response

After Conor Friedersdorf called for readers' recollections of 1968, Jeffrey C. Wray remembers the day from that year when his father was murdered:

After a few minutes, my brother Joe asked [our mother] what was going to happen to us. [She] pulled the three of us in tight and said, "I don't know. I don't know what is going to happen to us."

That spring and summer were rough-and-tumble days of protests and movements; and brutal, violent assassinations of Martin Luther King in April and Robert Kennedy in June. We felt all of the stress, tension, and energy of the times in our small town and in our black neighborhood within that small town. It was in those times—in that climate—that my mother had to decide what to do next in her life and for us, her children.

Here's what Wray's mother did.


Verbs

Stories rejected, modernity sought, saga continued, mystery explained.


Time of Your Life

Happy birthday to Natalie (twice the age of Pokémon); to Renee (a year younger than Disneyland); and to Kathy (twice the age of MTV).

Do you or a loved one have a birthday coming up? Sign up for a birthday shout-out here, and click here to explore the Timeline feature for yourself.


Meet The Atlantic Daily's team here, and contact us here.

Did you get this newsletter from a friend? Sign yourself up here.

<i>The Atlantic</i> Politics & Policy Daily: Bannon Fodder

Posted: 03 Jan 2018 02:35 PM PST

Today in 5 Lines

Steve Bannon called a meeting between the Trump team and a Russian lawyer "treasonous" in a forthcoming book. President Trump responded to his former chief strategist's comments in a statement, saying, "When he was fired, he not only lost his job, he lost his mind." Two new Senate Democrats—Doug Jones of Alabama and Tina Smith of Minnesota—were sworn in. North Korea reopened a communication line with South Korea for the first time in two years. And former Trump campaign chairman Paul Manafort filed a lawsuit against Special Counsel Robert Mueller.


Today on The Atlantic

  • Twitter Ban: Trump's tweet taunting North Korea Tuesday is just the latest example of why Twitter should ban world leaders from its platform, argues Conor Friedersdorf.

  • Is There Something Wrong With Trump?: James Hamblin makes the case for a streamlined system that could assess a president's mental fitness to lead.

  • The Upshot of Instability: Staff turnover in the Trump administration is much higher than under Trump's predecessors, but pushing campaign staffers out of the White House isn't necessarily a bad thing. (David A. Graham)

Follow stories throughout the day with our Politics & Policy portal.


Snapshot

Vice President Pence ceremonially swears in Senator-elect Doug Jones following the official taking of the oath of office in the Old Senate Chamber on Capitol Hill. Jonathan Ernst / Reuters


What We're Reading

The Plan to Lose: Donald Trump never actually wanted to be president, writes Michael Wolff in his forthcoming book: On Election Night, Trump "looked as if he had seen a ghost. Melania was in tears—and not of joy." (New York)

A Transforming Senate: Congress is back in session, with two new faces: Democrat Doug Jones, who has narrowed the GOP's Senate majority to a single seat, 51-49, and Minnesota Democrat Tina Smith, who raises the number of women in the Senate to a record high of 22. (Jessica Taylor, NPR)

Trump Drives a Wedge: Here's how Mike Pence and Jeff Flake, who were "once ideological soulmates and indivisible comrades," became proxies for the extreme reactions to Trumpism in the GOP. (Tim Alberta, Politico)

'Time to Stop Chasing Rabbits': Fusion GPS founders Glenn R. Simpson and Peter Fritsch share what they told Congress about the so-called Steele dossier. (The New York Times)

The Problem of the Elder Statesman: If Mitt Romney becomes the next senator from Utah, argues Erick Erickson, it could bring an end to the Republican Party. Here's how. (The Resurgent)


Visualized

Where the Green Cards Go: While the Trump administration has been criticizing "chain migration" in recent weeks, Democrats argue that many diagrams used to illustrate a family-based immigration system don't reflect reality. (Nick Miroff, The Washington Post)


Question of the Week

The Senate's longest-serving Republican, Orrin Hatch, announced on Tuesday that he'll retire at the end of his term, which opens the door for former Massachusetts Governor Mitt Romney to run for his seat. But what would a Senator Romney look like? The Atlantic's McKay Coppins asks: "Would he see himself as an anti-Trump truth-teller defending conservative principles from the poison of Trumpism? Or would he try to use his influence to pass major Republican legislation?"

This week, we want to know: Should Romney run? And if he did, what would you expect from him as a senator?

Share your response here, and we'll feature a few in Friday's Politics & Policy Daily.

-Written by Elaine Godfrey (@elainejgodfrey) and Lena Felton (@lenakfelton)

How are we doing? Send questions or feedback to egodfrey@theatlantic.com.

The Quiet Exuberance of Winter

Posted: 03 Jan 2018 01:53 PM PST

"You have to be at peace with the fact that something might happen, and you might not make it through," says Alexandra de Steiguer, the caretaker for the Oceanic Hotel, in Brian Bolster's short documentary, Winter's Watch. De Steiguer has spent the past 19 winters tending to the 43-acre grounds of the hotel, on Star Island, which sits 10 miles off the coast of New England. In the long, wintry off-season, she is the island's sole inhabitant.


Winter's Watch explores de Steiguer's relationship to extreme isolation. Its meditative imagery contemplates the beauty of absence, while de Steiguer reflects on the unique challenges and rewards of solitude. "There are no other distractions," she says. "You have to decide how to fill your days ... and yet it is peaceful, and I can use my imagination."


The hulking—and possibly haunted—hotel bears a striking resemblance to the Overlook Hotel of The Shining, but de Steiguer maintains that "if there are ghosts out here, they are being extremely kind to me." Rather, she has embraced what she calls "the great waiting of winter."


"Being alone here and seeing the struggle of winter makes me feel connected to the web of life," she says. "Winter has a quiet exuberance. You have to look into the bones."


Why Trump Turned on Steve Bannon

Posted: 03 Jan 2018 12:37 PM PST

International crises will have to wait. President Trump has a new nemesis even more infuriating than North Korea's Kim Jong Un: his own former campaign CEO and chief White House strategist, Steve Bannon.

Wednesday morning, The Guardian and NBC News published reports based on a book by Michael Wolff, and New York published a juicy, page-turner of an excerpt from the volume. They all add up to a damning image of the Trump administration and especially its leader. The president is depicted as out of touch, borderline illiterate, disrespected by his own closest advisers, sloppy with information, and horrified at winning the election. As I wrote, the most damaging revelations are those that come from Bannon—a Frankenstein's monster of Trump's own creation; the president elevated him as an insider and failed to see the danger he posed.

Early Wednesday afternoon, the president fired back at Bannon with some of the more scorching comments he has made publicly—impressive for a president in love with invective. In a written statement distributed by Press Secretary Sarah Sanders, Trump unloaded:

Steve Bannon has nothing to do with me or my Presidency. When he was fired, he not only lost his job, he lost his mind. Steve was a staffer who worked for me after I had already won the nomination by defeating seventeen candidates, often described as the most talented field ever assembled in the Republican party.

Now that he is on his own, Steve is learning that winning isn't as easy as I make it look. Steve had very little to do with our historic victory, which was delivered by the forgotten men and women of this country. Yet Steve had everything to do with the loss of a Senate seat in Alabama held for more than thirty years by Republicans. Steve doesn't represent my base—he's only in it for himself.

Steve pretends to be at war with the media, which he calls the opposition party, yet he spent his time at the White House leaking false information to the media to make himself seem far more important than he was. It is the only thing he does well. Steve was rarely in a one-on-one meeting with me and only pretends to have had influence to fool a few people with no access and no clue, whom he helped write phony books.

We have many great Republican members of Congress and candidates who are very supportive of the Make America Great Again agenda. Like me, they love the United States of America and are helping to finally take our country back and build it up, rather than simply seeking to burn it all down.

That's a lot to digest. Bannon's role in the Trump victory is a matter of dispute; Bannon certainly believes he played a large role, and thanks to his often-cozy relations with reporters, he was able to spread that story, irking Trump as early as spring 2017. In other words, Trump is right to point a finger at Bannon for his leaking, as well as for his hot-and-cold relationship with the mainstream media.

It is true that Trump had already won the primary when Bannon joined the campaign. It is also true that the Trump campaign was sputtering and faltering when Bannon arrived—Wolff reports, in fact, that on election night, no one on the campaign, including the candidate, expected to win. But presidential wins are never attributable to a single person or factor, and there are many reasons Trump won.

When Bannon was fired, in August, he promised to go to war, but at the time he portrayed this as a war against Trump's enemies, fought from the outside, and told The Weekly Standard's Peter Boyer that the president had encouraged him. "I said, 'Look, I'll focus on going after the establishment,'" Bannon said. "He said, 'Good, I need that.' I said, 'Look, I'll always be here covering for you.'"

Since then, however, the two men have found themselves increasingly at odds, though the president reportedly continues to speak with Bannon from time to time, at least by Bannon's account. Although the White House was bracing for Wolff's book—like so many Trump administration crises, this one was self-inflicted, with Trump inviting Wolff into the West Wing—well-sourced White House reporters said that Bannon's decision to attack members of the Trump family openly, on the record, inspired unusual fury from the president.

So the president has decided to make an enemy of Bannon with the searing statement Wednesday. The diction and tone bear little resemblance to Trump's own patois, and thus were probably written by an aide. There are upsides and downsides to picking a fight with a guy who espouses "#WAR" and brags that like the honey badger, he don't give a shit. Bannon has little compunction about fighting scorched-earth battles, and he is back atop Breitbart, a central node of Trump supporters. Yet Trump no longer needs Breitbart as he once did—he is now president of the United States—and as the statement noted, Bannon's record of backing candidates since he left the White House does not inspire confidence in his generalship. Moreover, by taking the offensive, Trump seeks to seize back control of the news cycle, shifting the focus away from Wolff's reporting and to his statement.

While the tone of Trump's statement is jarring, the maneuver of distancing himself is old hat. Each time a former staffer causes a problem for Trump, he pretends they played no role, that he barely even knew them. Trump exalted Bannon early on, but soon grew envious of the praise Bannon was getting. "I'm the one who makes the decisions," he told The New York Times in November 2016. After Bannon landed on the cover of Time magazine, a space Trump cherishes, he began making critical comments. During an April press conference, the president foreshadowed Wednesday's statement. "I like Steve, but you have to remember he was not involved in my campaign until very late. I had already beaten all the senators and all the governors, and I didn't know Steve," he said. "I'm my own strategist and it wasn't like I was going to change strategies because I was facing crooked Hillary." In an interview with The Wall Street Journal, he called Bannon merely "a guy who works for me."

After Bannon was pushed out of the White House, Trump praised him on Twitter, though the comment was carefully phrased to avoid giving him too much credit.

In October, Trump expressed sympathy with Bannon's criticism of the Republican-led Congress. "I know how he feels," the president said, to the annoyance of GOP leaders. But after Roy Moore lost the U.S. Senate race in Alabama, many Republicans blamed Bannon for pushing Moore as the GOP candidate. Then came Wednesday's rupture.

Bannon is hardly the first person to get this treatment. Paul Manafort chaired Trump's campaign for three crucial months, including the period when he clinched and accepted the Republican nomination, but in February 2017, Trump said, "Paul Manafort was replaced long before the election took place. He was only there for a short period of time." A month later, Press Secretary Sean Spicer was widely mocked for saying that Manafort "played a very limited role for a very limited amount of time." (Manafort has since been indicted for laundering $75 million; Trump and his lawyers noted the alleged behavior was not related to the campaign.)

Something similar happened to Michael Flynn, once a close friend and key campaign surrogate of the president's. Trump named Flynn national-security adviser shortly after the election, despite warnings not to do so from aides and from President Barack Obama. Later, after Flynn had lied to both Vice President Mike Pence and the FBI about conversations with the Russian ambassador, Trump's spokespeople praised Flynn's character and service, and according to sworn congressional testimony, Trump asked then-FBI Director James Comey to pull back an investigation into Flynn. Yet when Flynn recently pleaded guilty to lying to the FBI, Trump lawyer Ty Cobb dismissed him as "a former Obama administration official."

When George Papadopoulos joined the Trump campaign as a foreign-policy adviser, Trump included him in small meetings and singled him out for praise during a meeting with The Washington Post editorial board, calling him "an excellent guy." After Papadopoulos pleaded guilty to lying to FBI agents, however, the president dismissed him as a "young, low level volunteer" who "few people knew."

The president's desire to distance himself from this motley assortment is understandable: Who wants to be associated with a pair of felons, an accused money-launderer, and a harsh critic? Trump is not the first president to try to create separation from an aide-turned-critic. In his memoir Decision Points, George W. Bush didn't mention Scott McClellan, a former press secretary who wrote a highly critical book. Bush's response was similar in substance, though not tone, to Trump's. "He was not a part of a major decision," he said. "This is a book about decisions. This isn't a book about, you know, personalities or gossip or settling scores. I didn't think he was relevant."

Yet this distancing, especially when conducted repeatedly as Trump has done, is self-evidently flimsy. How many close aides can one credibly disavow? Soon the denials become difficult to credit and begin to look like evidence of poor character judgment in the hiring process. Trump currently faces a perilous special-counsel investigation, and at least one former aide, Flynn, has agreed to cooperate with Robert Mueller. In such a situation, maintaining the loyalty of one's close aides is important. His insistence that Bannon and all the others were never more than bit players is the latest reminder for current staffers that Trump views loyalty as a one-way street.

It's Colder Than Hoth Out Here

Posted: 03 Jan 2018 01:32 PM PST

Midwinter temperatures have gripped the Northern Hemisphere, and much of North America is currently caught in a deep freeze. Yesterday morning, parts of all 50 states were below freezing. Some are bundling up and making the most of the weather, skiing, fishing, or taking a polar bear plunge—while others are struggling to cope and working hard against the elements. As the East Coast of the United States prepares for an approaching storm ominously dubbed a "bomb cyclone," I have gathered a collection of recent chilly images from across the North.

Ancient Infant's DNA Reveals New Clues to How the Americas Were Peopled

Posted: 03 Jan 2018 10:04 AM PST

Around 11,500 years ago, at a place that is now called the Upward Sun River, in the region that has since been named Alaska, two girls died. One was a late-term fetus; the other, probably her cousin, was six weeks old. They were both covered in red ochre and buried in a circular pit, along with hunting weapons made from bones and antlers. "There was intentionality in the burial ceremony," says Ben Potter from the University of Alaska at Fairbanks, who uncovered their skeletons in 2013. "These were certainly children who were well-loved."

Now, several millennia after their short lives ended, these infants have become important all over again. Within their DNA, Potter's team has found clues about when and how the first peoples came to the Americas.

They did so from East Asia—that much is clear. Today, Russia and Alaska are separated by the waters of the Bering Strait. But tens of thousands of years ago, when sea levels were lower, that gap was bridged by continuous land, hundreds of miles wide and covered in woodlands and meadows. This was Beringia. It was a harsh world, but you could walk across it—and people did.

The Upward Sun River infants, who have been named "Xach'itee'aanenh T'eede Gaay" (Sunrise Girl-Child) and "Yełkaanenh T'eede Gaay" (Dawn Twilight Girl-Child) by the local indigenous community, were found at a crucial point along this route. Few human remains have been found from such a northerly or westerly part of the Americas, or from such an ancient time. "It's hard to impress upon you how rare they are," says Potter. "The window into the past that these children provide is priceless."

By analyzing the older infant's genome, Potter and his colleagues, including José Víctor Moreno Mayar and Lasse Vinner, have shown that she belonged to a previously unknown group of ancient people, who are distinct from all known Native Americans, past and present. The team has dubbed them the Ancient Beringians.

"We'd always suspected that these early genomes would have important stories to tell us about the past, and they certainly didn't disappoint," says Jennifer Raff from the University of Kansas, who was not involved in the study.

By comparing Xach'itee'aanenh T'eede Gaay's genome to those of other groups, the team showed that the Ancient Beringians and other Native Americans descend from a single founding population that started to split away from other East Asians around 36,000 years ago. They became fully separated between 22,000 and 18,000 years ago, and then split into two branches themselves. One gave rise to the Ancient Beringians. The other gave rise to all other Native Americans, who expanded into the rest of the Americas. Native Americans, then, diverged into two more major lineages—a northern and a southern one—between 17,500 and 14,600 years ago.

This story unequivocally supports the so-called Beringian standstill hypothesis, "which for a long time has been the dominant explanation for how people initially peopled the Americas," says Raff. This scenario says that the ancestors of Native Americans diverged from other East Asians at a time when ice was smothering the Northern Hemisphere. That left them stranded and isolated for millennia somewhere outside the Americas, for their eastward movements were blocked by a giant ice sheet that covered much of North America. Only when that sheet started melting, around 15,000 years ago, could they start migrating down the west coast of the Americas.

Xach'itee'aanenh T'eede Gaay's genome anchors this narrative in time, suggesting that the millennia-long pit stop took place between 14,000 and 22,000 years ago. It doesn't, however, say where those early peoples stood still.

In one scenario, they paused in Beringia itself and split into two lineages there. One, the Ancient Beringians, stayed put. The other eventually made it further east and south and gave rise to the other Native Americans. If that's right, "there was just a single migration of people from Asia who peopled the New World," says Connie Mulligan from the University of Florida. She and others have found further evidence for that idea, but "this study provides the final piece needed to prove there was only a single migration," she says.

But Potter prefers an alternative scenario in which the standstill took place further back in northeast Asia, and the Ancient Beringians split from other Native Americans there. Both groups then independently traveled into Beringia and subsequently into the Americas, perhaps by different routes or perhaps at different times.

Partly, this debate hinges on a controversial archeological site at the Bluefish Caves in Canada's Yukon Territory. A recent study says that animal bones from the site, which seem to bear traces of human cut-marks, are 24,000 years old. Raff accepts the Bluefish evidence; Potter doesn't. If the marks really were made by humans, and really are that old, people must have been in Beringia by that point, and likely paused there. If they're not ... the find doesn't really rule out either hypothesis.

Either way, both scenarios can now be tested with future data from either ancient DNA or archaeological finds. And both scenarios argue against an attention-grabbing study from last year which claimed that hominids were in North America 130,000 years ago, based on the bones of a mastodon that had supposedly been butchered with nearby stone tools. "I am super skeptical about that," says Potter. "Early modern humans aren't even out of Africa at that point, so you'd be talking about, I don't know, a Denisovan? And there are no Denisovans within 10,000 miles of that site."

It's also unclear what became of the Ancient Beringians. They have no obvious direct descendants, and the people who currently live at the Upward Sun River—the Athabascans—are descended from one of the other groups of Native Americans. The Athabascans may carry traces of Ancient Beringian ancestry, but it's hard to say without analyzing their genomes.

Such work has had a troubled history. As I've written before, in the 1990s, Arizona State University scientists collected samples from the Havasupai tribe to study the genetics of diabetes but, without their knowledge, also used those samples to study schizophrenia, inbreeding, and migration patterns. When the Havasupai found out, they successfully sued the university for $700,000 and banned its researchers from their land.

Another bitter controversy surrounded the Ancient One—an 8,500-year-old skeleton that was discovered in Washington State, and became known as Kennewick Man in non-native circles. For almost two decades, five tribes pushed for the bones to be reburied, fighting against parties who disputed his native ancestry. After an analysis of his genome confirmed that he was indeed Native American, Barack Obama signed an order in December 2016 finally allowing him to be reburied. The five tribes were all invited to take part in future studies, but only the Colville tribe accepted. "We were hesitant," their representative James Boyd told The New York Times. "Science hasn't been good to us."

Some of the scientists involved in sequencing the Ancient One's genome also worked on the Upward Sun River study. "They've made progress in doing more consultative and consensual research," says Kim TallBear from the University of Alberta, who studies the intersection of race and genetics and is a member of the Sisseton-Wahpeton Oyate tribe. But she's also uninterested in the questions they are asking. "This type of research is done largely for the benefit of nonindigenous peoples," she says. They center a "settler-colonial narrative" about a "largely one-way migration story into the Americas and the idea that everyone is in some form an immigrant."

Indigenous peoples, TallBear says, have more complex narratives about their relationship with their lands, and their webs of obligation with each other and other animals. "I am interested in indigenous worldviews conditioning more scientific inquiry. What different questions might indigenous peoples ask of genomics?"

Potter says that he takes these concerns very seriously, and worked hard to keep a positive relationship with indigenous communities. Unlike in the case of the Ancient One, he made sure to get the support of the Athabascans before any work was actually done and any DNA was sequenced.

"I'm also interested in what they're interested in," he says. "What can we include in our analysis that we can give back to them?" For example, after learning how important salmon fishing is to the Athabascans, his team found evidence of humans using salmon at the Upward Sun River site—the earliest such evidence in the Americas. "The longevity of resource use in the past is highly relevant to people now," he says.

That's the kind of insight that TallBear is after: not into how people got there, but how they actually lived. And given the two dead infants, those lives were likely harsh. "We don't know the overall population, but we can reasonably infer that it was relatively low—maybe 20 to 40 people," says Potter. "To have these children die over one or two summers, in the season with the most abundance of resources, tells us something about the risky and delicate nature of life in the far north."

Waiting for the Bomb to Drop

Posted: 03 Jan 2018 03:22 PM PST

The decision to move the American embassy to Jerusalem makes a war in Korea more likely. Not because there is any direct connection between the two, nor because it was a bad idea, recognizing as it did the simple fact that the western part of Jerusalem has been Israel's capital for over 70 years and will most assuredly remain so. The dangerous bit, rather, was when pundits and diplomats wrung their hands and predicted calamity and (far more predictably) nothing happened. The Arab street grumbled, while Cairo, Riyadh, and Abu Dhabi looked the other way, and Donald Trump could be forgiven for thinking that his instincts had been proven entirely correct.

And therein lies the danger. As we can see from the irritable id that manifests itself in his tweets, Trump believes that he has been an exceptionally successful president, who deserves credit for the absence of deaths from plane crashes, a soaring stock market, and a tax cut rammed through on a completely partisan basis by a Republican majority that has thirsted for little else for decades. Those who made fun of his claims that he is smarter than his generals and that there is no need for a fully staffed State Department because he is around, or mocked his boasts that his nuclear button is bigger and better than Kim Jong Un's, should stifle their chuckles. This is serious. The president feels vindicated, smart, and self-confident beyond the outlandish egotism of his campaign days in 2016.

This is serious first and foremost because the North Korean threat is serious. National Security Adviser H. R. McMaster is correct when he notes that the North Koreans have always been willing to sell anything—literally anything—to anybody with the hard cash to buy it. That will be true of their nuclear weapons. It is indeed true that Pyongyang is on the verge of acquiring the ability to obliterate Los Angeles, and eventually Washington. It is certain that this regime has shown no respect for any international norms, let alone international law; that it has committed murder; that it lives in a psychotic cocoon of its own making; and that it will stop at nothing. And it is true, finally, that this dangerous circumstance is not of Trump's making: It is rather the consequence of policies that bought time and offered no idea what to do with the time that was purchased through shifting combinations of diplomacy, bribes, sanctions, and skullduggery.

Any administration faced with these facts, and at this technological moment in the North Korean program, would have weighed carefully the possibility of a preventive war—it would be the prudent strategic thing to do. And then that administration would have walked away from it. A deliberately initiated war still runs the risk of a humanitarian disaster because, as everyone now realizes, Seoul is within range of thousands of North Korean artillery tubes and rocket batteries. Hundreds of thousands of civilians, including American expats and dependents, would perish in the war that could be unleashed. Even assuming some magical technologies that enable the U.S. to disarm North Korea and decapitate its leadership, who is to say that the ensuing war would not have its way even so?

The consequences of preventive war—a war deliberately initiated by the United States or launched as a result of provocations by one side or both that then escalates—go far beyond this. South Korea, within the memory of people now living, has gone from being poorer than most African countries in 1950 to becoming a first-world technological and economic powerhouse. Could South Koreans forgive the Americans for the slaughter of their citizens and the devastation of their cities because of weapons aimed a hemisphere away? Would the Chinese meekly accept an American conquest of North Korea, or even simply the elimination of the Kim dynasty? Or are they more likely to pour troops, aircraft, and missiles into the Korean peninsula and to warn the Americans off? And where might that lead?

To judge by his public statements, McMaster, like United Nations Ambassador Nikki Haley, is hard over on the notion that North Korea has to be denuclearized, be it by peaceful surrender or by force. He has used the words preventive war on several occasions. In so doing he is, of course, echoing the president, but it is reasonable to think that he agrees with the basic idea. And that would not be entirely surprising: His duty is to ensure the security of the United States, and North Korea's intercontinental ballistic missiles would undoubtedly be a threat to that.

What the president's advisers may not fully appreciate are the political perils of taking such a hazardous course. The fact is that a majority of the American people seem to believe that many words coming out of the president's mouth are lies. That will not change when he sits behind the Resolute desk and tries to explain why he has launched a Korean war. Trump's advisers may think that their credibility can substitute for their boss's lack of it, but they are wrong there too, for, inevitably, their reputations for integrity have been tainted by their own Trumpian pronouncements. Abroad, some governments—Australia and Japan, for example—may feel compelled to side with the Americans. But they too will discover that their populations' mistrust and disgust toward the American president will undermine their participation in a war. And when all these forces come together, the political firestorms may sweep away long-standing international relationships as well as myriads of Korean and American lives.

There are sounds, for those who can hear them, of the preliminary and muffled drumbeats of war. The Chinese are reported to be preparing refugee camps along the North Korean border. Resources are being shifted to observe and analyze the North Korean military. Mundane logistical processes of moving, stockpiling, and updating crucial items and preparing military personnel are under way. Only the biggest indicator—the evacuation of American dependents from South Korea—has yet to flash red, but, in the interest of surprise, that may not happen. America's circumspect and statesmanlike secretary of defense, James Mattis, talks ominously of storm clouds gathering over Korea, while the commandant of the Marine Corps simply says, "I hope I'm wrong, but there's a war coming."

Maybe nothing will happen. Maybe Donald Trump, he of the five draft deferments during the Vietnam War, will flinch from launching a war as commander in chief, in which case the United States will merely suffer an epic humiliation as it retreats from as big a red line as a president has ever drawn. Still, lots of people have an interest in war. For Russia, the opportunity to set the United States and China against each other over Korea is a dream come true. For narrow-minded American strategists, it is the only way of cutting the North Korean nuclear Gordian knot. For Kim Jong Un, peeking over the edge of the precipice may cause South Korea to break with the Americans, or the Chinese to fight them. For Donald Trump, it may be a moment of glory, a dramatic vindication of campaign promises, and an opportunity to distract American minds from Robert Mueller's investigation of his campaign's ties to the Russians. And so threats and bluster may turn into violent realities. And if they do, not tomorrow or the next day, but some time in 2018, a Second Korean War could very well make it one of those years in which history swings on its hinge.

The Essential Saga of Don Hertzfeldt's World of Tomorrow

Posted: 03 Jan 2018 09:08 AM PST

In Don Hertzfeldt's films, the unconscious can be a grand, terrifying playground, a vast sci-fi landscape of swirling vortexes, rainbow-colored clouds, and shiny oblong rocks strewn beside a black sea. Early in World of Tomorrow Episode Two: The Burden of Other People's Thoughts, Hertzfeldt's dizzying sequel to his Oscar-nominated 2015 short film World of Tomorrow, a little girl and her adult clone are exploring a virtual representation of the clone's brain. The two walk around a beach of memories, and the girl picks up a shiny object. "That is a glimmer of hope," her grown-up copy tells her. "Put it back."

You'd be forgiven for thinking this all sounds a little heady. But the delight of the World of Tomorrow series is the clarity of Hertzfeldt's ideas, and how powerfully they ring through the strange, knotty stories of the future he's depicting through a mix of stick-figure animation and surreal digital effects. The first World of Tomorrow runs 16 minutes, and the sequel (now available to rent on Vimeo) is 22 minutes long. Within those short timeframes, Hertzfeldt tackles how humankind will eventually contend with immortality, time travel, cloning, deep-space colonization, and the end of the world. Hint: We don't handle any of it very well.

World of Tomorrow, which is currently available to stream on Netflix, is extremely funny, hauntingly sweet, and dense with philosophical concepts. It follows the adventures of a child named Emily (Winona Mae), who is visited by her third-generation clone (Julia Pott), a woman from hundreds of years in the future who has had all of Emily's memories uploaded into her brain to guarantee a sort of immortality. The Emily-clone is an odd creature speaking in halting sentences, a copy of a copy trying to hang on to some semblance of a personality, where her revered ancestor (referred to as Emily Prime) is a bubbly kid who runs around and rambles about this and that.

Emily Prime was voiced by Hertzfeldt's four-year-old niece, her performance cobbled together from audio recordings he made of her drawing and playing. For Episode Two, Hertzfeldt repeated the process (when his niece was five years old), and found it more difficult to assemble a logical plot around her. "That madness of childhood imagination had fully taken hold and she was spouting endless monologues about fantasy worlds," he said in an interview. Perhaps that's why Episode Two lacks the relative simplicity of its forebear. But the challenge posed to Hertzfeldt helped him expand his world and dig into weirder new ideas, while maintaining the goofy energy of the first short film.

Emily Prime is still a giggling young girl, more interested in explaining what she's drawing than in the psychotic landscapes exploding around her. But this time she's visited by Emily-6, a clone of Emily's future clone from the last film. Emily-6 was created to exist only as a memory backup, but now serves no purpose, as her "sister" perished in an apocalyptic event. Yes, there's an ironic twist to Hertzfeldt making the subject of his sequel a further copy of a copy, but Emily-6 is even more odd and removed from reality than her counterpart was. She guides Emily Prime through the confused landscape of her own psyche, in hopes that her original self can untangle it.

This means Episode Two is less concerned with the life of Emily Prime, which was explored in the first entry, and more interested in digging into the broken future that creates these sad, memory-lacking clones. We watch Emily-6 recollect her experiences growing up in a test tube on a deep-space colony, bonding with other Emily clones, and obsessing over the hallowed life of their ancestor despite lacking any real-world context with which to understand it. It is a pitiful existence, even grimmer than that of the Emily clone of the first World of Tomorrow, but Hertzfeldt finds compassion even in these faded copies, with the happy-go-lucky Emily Prime convincing Emily-6 that her life is not without meaning.

After walking among her "glimmers of hope" with Emily Prime, Emily-6 recalls her most disturbing memory: squishing a bug as a child and realizing that it had no backup copies, no clones in storage who could carry on its life. "All of its experiences are gone forever. We can never know them," she tells Emily Prime. "If there is a soul, it is equal in all living things," she concludes, and that's the tale's larger point—that no matter how horrifying the future might be, even some twisted piece of humanity will still be ineffably alive.

Hertzfeldt's special brand of storytelling conjures incredible emotional depths even from the simplest creations. His characters are mere stick figures with basic cartoon faces, and their backgrounds (like Emily-6's test-tube creation) might be completely ludicrous and fanciful, but they're more lovable than most flesh-and-blood actors manage to be in live-action films. The idea of Emily-6 might be distressing to consider, but as a sci-fi heroine she's among the best the genre's ever seen.

It's ideal to watch World of Tomorrow's two episodes together, so the smaller details of Hertzfeldt's world-building make more sense. But Episode Two is no tired rehashing of previous concepts; it deserves the same cult status its predecessor quickly acquired. Hertzfeldt is still an artist working on the fringes of American animation, but he should be considered one of the medium's best storytellers. And though future episodes of World of Tomorrow will never get the hype of a Star Wars movie, in any ranking of ongoing sci-fi franchises, the saga of Emily Prime should be near the top.

Steve Bannon Comes Back to Haunt Donald Trump

Posted: 03 Jan 2018 11:36 AM PST

Donald Trump shows little affinity for reading, but he is familiar with the conceit of Mary Shelley's Frankenstein: An ambitious figure creates a monster in the hopes of glorifying himself, only to have the spurned monster wreak havoc on its creator.

The president is living that plot, too, with Steve Bannon—and not only because his former consigliere sometimes displays the bedraggled, frightening mien of Dr. Frankenstein's creation. During the heat of the presidential campaign, Trump plucked Steve Bannon from Breitbart to be the chief executive of his campaign; after winning, he appointed Bannon chief strategist of his White House, then pushed him out in August.

Now Bannon is back to haunt Trump, most strikingly, in a new book by journalist Michael Wolff. The Guardian obtained a copy of the book, due out later this week, which is built on hundreds of interviews with the president and administration insiders. In the excerpts published by The Guardian, Bannon shows that even from outside the White House, he is more than able to sow the same chaos and backbiting that got him pushed out.

More materially, he hammers Trump's son Don Jr., son-in-law Jared Kushner, and then-campaign chair Paul Manafort for meeting with Russian lawyer Natalia Veselnitskaya at Trump Tower in June 2016.

"The three senior guys in the campaign thought it was a good idea to meet with a foreign government inside Trump Tower in the conference room on the 25th floor—with no lawyers. They didn't have any lawyers," Bannon said, per Wolff. "Even if you thought that this was not treasonous, or unpatriotic, or bad shit, and I happen to think it's all of that, you should have called the FBI immediately."

Bannon predicted that Special Counsel Robert Mueller would go after all three participants for money-laundering, citing Mueller's hiring of experienced financial-crimes prosecutor Andrew Weissmann. "They're going to crack Don Jr. like an egg on national TV," Bannon said.

In describing the Trump Tower meeting as "treasonous" and "unpatriotic," Bannon becomes the first major Trump insider to say what is at this point clear to anyone willing to look at the facts: Whether or not there were any crimes committed, Trump aides colluded with Russia. The pattern runs from George Papadopoulos's conversations with Russian agents, through the Trump Tower meeting, and up to Michael Flynn's conversations with then-Ambassador Sergey Kislyak, about which Flynn has pleaded guilty to lying to FBI agents.

None of this proves that the Trump campaign committed a crime, nor that these actions determined the outcome of the election. But they do show that Trump's repeated insistence that there was no collusion isn't credible. For a time last summer, Trump and his defenders quit claiming there was no collusion, and adopted a new talking point: that collusion was entirely normal and proper. More recently, the president has returned to claiming there was no collusion.

"I think it's all worked out because frankly there is absolutely no collusion, that's been proven by every Democrat is saying it," he told The New York Times last week. (His claim about Democrats is false.) "So, I think it's been proven that there is no collusion."

The fact that Bannon dares call it treason is a powerful counter to this denial, and it's powerful because Bannon's name will forever carry the label "former White House chief strategist." As Dick Morris, Jeffrey Lord, Pat Caddell, and dozens of other mediocrities and washed-up operatives can attest, such an imprimatur can sustain a career for decades. By elevating Bannon to that title, Trump set himself up for pain. Before Trump hired him, Bannon had a checkered record—he'd done decently for himself financially, but he had an up-and-down business career and his proudest achievement was sitting atop Breitbart—a news organization that even he disparages in Wolff's book (saying a leak could go "down to Breitbart or something like that, or maybe some other more legitimate publication").

Bannon was able to help usher Trump to a victory in the election, though the portion of the credit he deserves is in dispute. An electoral win tends to have a thousand fathers, and in any case Trump felt Bannon often inflated his role. Once in the White House, the self-proclaimed Leninist proved to be just as much an agent of chaos as he had proudly been at Breitbart—pushing his own priorities, even when they put him at odds with the rest of the White House; fighting bitter internecine wars via leaks; and giving damaging interviews to liberal publications.

In August, after John Kelly became chief of staff, Bannon was finally pushed out. That took the temperature down on the feuds within the West Wing, but Bannon made clear he wasn't dropping his fight. "I feel jacked up," he told The Weekly Standard's Peter Boyer. "Now I'm free. I've got my hands back on my weapons. Someone said, 'It's Bannon the Barbarian.' I am definitely going to crush the opposition." Bannon was a hazard to White House stability from within the administration, but he is showing that he can be just as hazardous outside it, too.

This is so even as there is little reason to impute any purity of motive to Bannon's comments. He appears to be driven in large part by the urge to extend his feud with his old White House nemesis Jared Kushner, the president's son-in-law and a senior adviser. In the process, he has contradicted himself. Late last year, he told Vanity Fair's Gabriel Sherman that he thought there was no collusion case, but he took a shot at Kushner anyway: "He's taking meetings with Russians to get additional stuff. This tells you everything about Jared. They were looking for the picture of Hillary Clinton taking the bag of cash from Putin. That's his maturity level."

Bannon also told Wolff his rival was susceptible to prosecution for financial crimes. "It goes through Deutsche Bank and all the Kushner shit. The Kushner shit is greasy," he said. "They're going to go right through that. They're going to roll those two guys up and say play me or trade me." Kushner's lawyer has denied any wrongdoing, and Bannon's prediction mirrors existing speculation in the press. Does he know any more than the rest of us? It's entirely possible he does not, but since Trump gave him that coveted former-White-House-staff status, it's harder to dismiss him out of hand than it is any other pundit.

Bannon, who had not joined the campaign when the June 2016 meeting occurred, has also criticized Trump for firing FBI Director James Comey. But he demonstrates his own cynicism to Wolff, condemning the Trump Tower meeting while also slyly explaining how he would have conducted such skullduggery—arranging a meeting far from the epicenter of the campaign, "in a Holiday Inn in Manchester, New Hampshire, with your lawyers who meet with these people." It's a rare man who condemns something as treasonous, then explains how he would have committed the treason more effectively. This doesn't speak much for Bannon's sincerity, but his statement remains important for what it is: a former top aide to President Trump labeling collusion as such.

Why Do People Refer to a Nonexistent 'Nuclear Button'?

Posted: 03 Jan 2018 03:02 PM PST

Asking whether the nuclear button at President Trump's disposal is an actual button, as the president claimed on Twitter Tuesday, or merely a figurative term for the means by which a nuclear missile can be deployed is a bit like asking someone whether they'd prefer to be shot or stabbed to death: a distinction without a difference. And yet here we are, in the first week of the new year, asking precisely that question.

It began Monday, when Kim Jong Un, the North Korean leader, delivered his New Year's Day speech, in which he offered the possibility of talks with South Korea to reduce tensions caused by his nuclear-weapons and missile programs. But he also issued an ominous warning: "The entire United States is within range of our nuclear weapons, a nuclear button is always on my desk. This is reality, not a threat."

Much of the subsequent analysis focused on what it would mean for the U.S., which has taken a tough line on North Korea in order to force it into talks, if its ally South Korea begins talks with the North with no preconditions. The U.S. wants the North to renounce its nuclear weapons before beginning any talks—a precondition viewed as unrealistic by regional experts. But it was the part about "a nuclear button" that apparently caught Trump's attention.

Notwithstanding the puerile, schoolyard-like taunt from Trump, his tweet referred to the "nuclear football": a briefcase containing the launch codes a president must enter in order to authorize a nuclear strike, something no country has ordered since President Harry Truman dropped nuclear weapons on Japan to force it to surrender in World War II. (An early plan for nuclear war was codenamed "Dropkick." According to Robert McNamara, the Kennedy- and Johnson-era defense secretary, you need a "football" for a "dropkick.")

But the term "nuclear button" has been in use for decades. The earliest mention that I could find was from Lester Pearson's Nobel Peace Prize acceptance speech in 1957. Pearson, a former Canadian prime minister, won the prize for his role in resolving the Suez Crisis, but it was his "four faces of peace" speech in Oslo, in which he called for detente between the U.S. and the Soviet Union, for which he is often remembered. "Surely the glamour has gone out of war. The thin but heroic red line of the nineteenth century is now the production line," he said in Oslo (perhaps optimistically). "The warrior is the man with a test tube or the one who pushes the nuclear button. This should have a salutary effect on man's emotions. A realization of the consequences that must follow if and when he does push the button should have a salutary effect also on his reason."

The term's use continued through the Cold War. In the U.S., criticism of Senator Barry Goldwater's apparent openness to using nuclear weapons in Vietnam prompted a New York Times story on September 27, 1964, with the headline: "Controversy Grows On Who Controls Nuclear Button." But even as the term slipped into common usage—it has been in wide use this century, according to Google Trends, and well before that, according to LexisNexis—the 1962 Cuban Missile Crisis ensured that it would take more than the pressing of a single button for a U.S. president to launch nuclear weapons. In 2014, the former Washington Post reporter Michael Dobbs wrote in Smithsonian magazine that President John F. Kennedy was so "horrified by the doctrine known as MAD (mutually assured destruction), [he] ordered locks to be placed on nuclear weapons and demanded alternatives to the 'all or nothing' nuclear war plan." The result: the nuclear football, housed in a black briefcase carried by a military aide who accompanies the president.

"The Football does not actually contain a big red button for launching a nuclear war," Dobbs wrote. "Its primary purpose is to confirm the president's identity, and it allows him to communicate with the National Military Command Center in the Pentagon, which monitors worldwide nuclear threats and can order an instant response."

The term "nuclear button" might have outlived the Cold War, the fear of global destruction, "duck-and-cover" drills, and even one of its original antagonists, the Soviet Union. But as other countries, such as India and Pakistan, developed their own nuclear-weapons programs, the metaphorical "nuclear button" entered their lexicons of war, as it did in countries like Israel, which neither confirms nor denies the existence of its nuclear program.

It's not known if Kim Jong Un possesses an actual nuclear button, as he claimed, or a metaphoric one—but he, like his father and grandfather before him, enjoys absolute power. Even if he doesn't have an actual button to order a nuclear strike, it's quite possible he has something like it—with fewer safeguards in place than in the more established nuclear-weapons states. It's that uncertainty that enhances the dangers of a "nuclear button"—the idea that annihilation can be unleashed with such ease by simply pressing a button.

Pete Souza, the White House photographer for Presidents Reagan and Obama who has used his Instagram account to showcase his work and his criticism of Trump, said after the president's tweet on Tuesday:

God help us.


In a subsequent post, he too noted that the "nuclear button" isn't, in fact, a button.

But that's little comfort for tens of millions of people if a nuclear warhead is hurtling toward a major city on the Korean Peninsula or the United States.

Is Something Neurologically Wrong With Donald Trump?

Posted: 03 Jan 2018 09:57 AM PST

President Donald Trump's decision to brag in a tweet about the size of his "nuclear button" compared with North Korea's was widely condemned as bellicose and reckless. The comments are also part of a larger pattern of odd and often alarming behavior for a person in the nation's highest office.

Trump's grandiosity and impulsivity have made him a constant subject of speculation among those concerned with his mental health. But after more than a year of talking to doctors and researchers about whether and how the cognitive sciences could offer a lens to explain Trump's behavior, I've come to believe there should be a role for professional evaluation beyond speculating from afar.

I'm not alone. Viewers of Trump's recent speeches have begun noticing minor abnormalities in his movements. In November, he used his free hand to steady a small Fiji bottle as he brought it to his mouth. Onlookers described the movement as "awkward" and made jokes about hand size. Some called out Trump for doing the exact thing he had mocked Senator Marco Rubio for during the presidential primary—conspicuously drinking water during a speech.


By comparison, Rubio's movement was smooth, effortless. The senator noticed that Trump had stared at the Fiji bottle as he slowly brought it to his lips, joking that Trump "needs work on his form. Has to be done in one single motion, and eyes should never leave the camera."

Then in December, speaking about his national-security plan in Washington, D.C., Trump reached under his podium and grabbed a glass with both hands. This time he kept them on the glass the entire time he drank, and as he put the glass down. This drew even more attention. The gesture was like that of an extremely cold person cradling a mug of cocoa. Some viewers likened him to a child just learning to handle a cup.

Then there was an incident of slurred speech. Announcing the relocation of the American embassy in Israel from Tel Aviv to Jerusalem—a dramatic foreign-policy move—Trump became difficult to understand at a phonetic level, which did little to reassure many observers of the soundness of his decision.

Experts compelled to offer opinions on the nature of the episode were vague: The neurosurgeon Sanjay Gupta described it as "clearly some abnormalities of his speech." This sort of slurring could result from anything from a dry mouth to a displaced denture to an acute stroke.

Though these moments could be inconsequential, they call attention to the alarming absence of a system to evaluate elected officials' fitness for office—to reassure concerned citizens that the "leader of the free world" is not cognitively impaired, and on a path of continuous decline.

Proposals for such a system have been made in the past, but never implemented. The job of the presidency is not what it used to be. For most of America's history, it was not possible for the commander in chief to unilaterally destroy a continent, or the entire planet, with one quick decision. Today, even the country's missileers—whose job is to sit in bunkers and await a signal—are tested three times per month on their ability to execute protocols. They are required to score at least 90 percent. Testing is not required for their commander in chief to be able to execute a protocol, much less testing to execute the sort of high-level decision that would set this process in motion.

The lack of a system to evaluate presidential fitness only stands to become more consequential as the average age of leaders increases. The Constitution sets a lower age limit for the presidency but gives no hint of an upper one. At the time of its writing, septuagenarians were relatively rare, and surviving so long was a sign of hardiness and cautiousness. Now it is the norm. In 2016 the top three presidential candidates turned 69, 70, and 75. By the time of the 2021 inauguration, a President Joe Biden would be 78.

After age 40, the brain decreases in volume by about 5 percent every decade. The most noticeable loss is in the frontal lobes. These control motor functioning of the sort that would direct a hand to a cup and a cup to the mouth in one fluid motion—in most cases without even looking at the cup.

These lobes also control much more important processes, from language to judgment to impulsivity. Everyone experiences at least some degree of cognitive and motor decline over time, and some 8.8 percent of Americans over 65 now have dementia. An annual presidential physical exam at Walter Reed National Military Medical Center is customary, and Trump's is set for January 12. But the utility of a standard physical exam—knowing a president's blood pressure and weight and the like—is meager compared with the value of comprehensive neurologic, psychological, and psychiatric evaluation. These are not part of a standard physical.

Even if they were voluntarily undertaken, there would be no requirement to disclose the results. A president could be actively hallucinating, threatening to launch a nuclear attack based on intelligence he had just obtained from David Bowie, and the medical community could be relegated to speculation from afar.

Even if the country's psychiatrists were to make a unanimous statement regarding the president's mental health, their words may be written off as partisan in today's political environment. With declining support for fact-based discourse and trust in expert assessments, would there be any way of convincing Americans that these doctors weren't simply lying, treasonous "liberals"—globalist snowflakes who got triggered?

The downplaying of a president's compromised neurologic status would not be without precedent. Franklin Delano Roosevelt famously disguised his paralysis from polio to avoid appearing "weak or helpless." He staged public appearances to give the impression that he could walk, leaning on aides and concealing a crutch. Instead of a traditional wheelchair, he used an inconspicuous dining chair with wheels attached. According to the FDR Presidential Library, "The Secret Service was assigned to purposely interfere with anyone who tried to snap a photo of FDR in a 'disabled or weak' state."

Documenting the reality of Roosevelt's health status fell to journalists, who had been reporting on his polio before his first term. A 1931 analysis in Liberty magazine asked "Is Franklin D. Roosevelt Physically Fit to Be President?" and reported on his paralysis: "It is an amazing possibility that the next president of the United States may be a cripple." Once he was elected, Time described the preparation of the White House: "Because of the president-elect's lameness, short ramps will replace steps at the side door of the executive offices leading to the White House."

Today much more can be known about a person's neurological status, though little of it is as observable as hemiplegia. Unfortunately, the public medical record available to assuage global concerns about the current president's neurologic status is the attestation of Harold Bornstein, America's most famous Upper Manhattan gastroenterologist, whose initial doctor's note described the 71-year-old Trump as "the healthiest individual ever elected to the presidency."

The phrasing was so peculiar for a medical record that some suggested that Trump had written or dictated the letter himself. Indeed, as a key indicator of neurologic status, Trump's distinctive diction has not gone without scrutiny. Trump was once a more articulate person who sometimes told stories that had beginnings, middles, and ends, whereas he now leaps from thought to thought. He has come to rely on a small stable of adjectives, often involving superlatives. An improbably high proportion of what he describes is either the greatest or the worst he's ever seen; absolutely terrible or the best; tiny or huge.

The frontal lobes also control speech, and over the years, Donald Trump's fluency has regressed and his vocabulary contracted. In May of last year, the journalist Sharon Begley at Stat analyzed changes in his speech patterns during interviews over the years. She noted that in the 1980s and 1990s, Trump used phrases like "a certain innate intelligence" and "These are the only casinos in the United States that are so rated." I would add, "I think Jesse Jackson has done himself very proud."

He also more frequently finished sentences and thoughts. Here he is with Larry King on CNN in 1987:

King: Should the mayor of the city be someone who knows business?

Trump: Well, what we need is competence. We don't have that. We have a one-line artist. That's all he is ...

Or on Oprah in 1988:

Winfrey: What do you think of this year's presidential race, the way it's shaping up?

Trump: Well, I think it's going to be very interesting. I think that probably George Bush has an advantage, in terms of the election. I think that probably people would say he's got, like, that little edge in terms of the incumbency, etcetera, etcetera. But I think Jesse Jackson has done himself very proud. I think Michael Dukakis has done a hell of a job. And George Bush has done a hell of a job. They all went in there sort of as semi-underdogs—including George Bush—and they've all come out. I think people that are around all three of those candidates can be very proud of the jobs they've done.

Compare that with the meandering, staccato bursts of today. From an interview with the Associated Press:

People want the border wall. My base definitely wants the border wall, my base really wants it—you've been to many of the rallies. Okay, the thing they want more than anything is the wall. My base, which is a big base; I think my base is 45 percent. You know, it's funny. The Democrats, they have a big advantage in the Electoral College. Big, big, big advantage ... The Electoral College is very difficult for a Republican to win, and I will tell you, the people want to see it. They want to see the wall.

Ben Michaelis, a psychologist who analyzes speech as part of cognitive assessments in court cases, told Begley that although some decline in cognitive functioning would be expected, Trump has exhibited a "clear reduction in linguistic sophistication over time" with "simpler word choices and sentence structure."
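Neither Begley's nor Michaelis's exact methodology is described here, but the kind of "linguistic sophistication" comparison they refer to can be illustrated with a few crude corpus metrics. The sketch below is hypothetical and my own (the metric choices and helper name `lexical_stats` are not from the Stat analysis): it computes vocabulary diversity (type-token ratio), mean word length, and mean sentence length for a passage of transcribed speech, using short excerpts quoted earlier in the piece.

```python
import re

def lexical_stats(text):
    """Crude proxies for linguistic sophistication:
    type-token ratio (vocabulary diversity), mean word
    length in characters, and mean sentence length in words."""
    words = re.findall(r"[a-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "type_token_ratio": len(set(words)) / len(words),
        "mean_word_len": sum(map(len, words)) / len(words),
        "mean_sentence_len": len(words) / len(sentences),
    }

# Excerpts quoted above: the 1988 Oprah answer vs. the 2017 AP interview.
oprah_1988 = ("Well, I think it's going to be very interesting. I think that "
              "probably George Bush has an advantage, in terms of the election.")
ap_2017 = ("People want the border wall. My base definitely wants the border "
           "wall, my base really wants it.")

for label, text in [("1988", oprah_1988), ("2017", ap_2017)]:
    print(label, lexical_stats(text))
```

On longer transcripts these raw numbers need normalization (type-token ratio falls as sample length grows), which is one reason single-interview comparisons like this are only suggestive, not diagnostic.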

This is evident even off camera, as in last week's post-golf sit-down with The New York Times at his resort in Florida:

The tax cut will be, the tax bill, prediction, will be far bigger than anyone imagines. Expensing will be perhaps the greatest of all provisions. Where you can do something, you can buy something ... Piece of equipment ... You can do lots of different things, and you can write it off and expense it in one year. That will be one of the great stimuli in history. You watch. That'll be one of the big ... People don't even talk about expensing, what's the word "expensing." [Inaudible.] One-year expensing. Watch the money coming back into the country, it'll be more money than people anticipate. But Michael, I know the details of taxes better than anybody. Better than the greatest CPA. I know the details of health care better than most, better than most. And if I didn't, I couldn't have talked all these people into doing ultimately only to be rejected. Now here's the good news. We've created associations, millions of people are joining associations. Millions. That were formerly in Obamacare or didn't have insurance. Or didn't have health care. Millions of people. That's gonna be a big bill, you watch. It could be as high as 50 percent of the people. You watch. So that's a big thing ...

The paper said that the transcript was "lightly edited for content and clarity."

If Trump's limited and hyperbolic speech were simply a calculated political move—he repeated the phrase "no collusion" 16 times in the Times interview, which some pundits deemed an advertising technique—then we would also expect an occasional glimpse behind the curtain. In addition to repeating simplistic phrases to inundate the collective subconscious with narratives like "no collusion," Trump would give at least a few interviews in which he strung together complex sentences, for example to make a case for why Americans should rest assured that there was no collusion.

Though it is not possible to diagnose a person with dementia based on speech patterns alone, these are the sorts of changes that appear in early stages of Alzheimer's. Trump has likened himself to Ronald Reagan, and the changes in Trump's speech evoke those seen in the late president. Reagan announced his Alzheimer's diagnosis in 1994, but there was evidence of linguistic change over the course of his presidency that experts have argued was indicative of early decline. His grammar worsened, and his sentences were more often incomplete. He came to rely ever more on vague and simple words: indefinite nouns and "low imageability" verbs like have, go, and get.

After Reagan's diagnosis, former President Jimmy Carter sounded an alarm over the lack of a system to detect this sort of cognitive impairment earlier on. "Many people have called to my attention the continuing danger to our nation from the possibility of a U.S. president becoming disabled, particularly by a neurologic illness," Carter wrote in 1994 in the Journal of the American Medical Association. "The great weakness of the Twenty-Fifth Amendment is its provision for determining disability in the event that the president is unable or unwilling to certify to impairment or disability."

Indeed, the 1967 amendment laid out a process for transferring power to the vice president in the event that the president is unable to carry out the duties of the office due to illness. But it generally assumed that the president would be willing to undergo diagnostic testing and be forthcoming about any limitations.

This may not happen with a person who has come to be known for denying any hint of weakness or inability. Nor would it happen if a president had a psychiatric disorder that impaired judgment—especially if it was one defined by grandiosity, obsession with status, and intense aversion to being perceived as weak.

Nor would it happen if the only person to examine the president was someone like Harold Bornstein—whose sense of objective reality is one in which Donald Trump is healthier than the 42-year-old Theodore Roosevelt (who took office after commanding a volunteer cavalry regiment called the Rough Riders, and who invited people to the White House for sparring sessions, and who after his presidency would sometimes spend months traversing the Brazilian wilderness).

It was for these reasons that in 1994, Carter called for a system that could independently evaluate a president's health and capacity to serve. At many companies, even where no missiles are involved, entry-level jobs require a physical exam. A president, it would follow, should be more rigorously cleared. Carter called on "the medical community" to take leadership in creating an objective, minimally biased process—to "awaken the public and political leaders of our nation to the importance of this problem."

More than two decades later, that has not happened. But questions and concerns around Trump's psychiatric status have spurred proposals anew. In December, also in the Journal of the American Medical Association, mental-health professionals proposed a seven-member expert panel "to evaluate presidential fitness." Last April, Representative Jamie Raskin introduced a bill that would create an 11-member "presidential capacity" commission.

The real-world application of one of these systems is complicated by the fact that the frontal lobes also control things like judgment, problem-solving, and impulse control. These metrics, which fall under the purview of psychiatrists and clinical psychologists, can be dismissed as opinion. In a hospital or doctor's office, a neurologist may describe a patient with Parkinson's disease as having "impaired impulse control." The National Institute on Aging lists among the symptoms of Alzheimer's "poor judgment leading to bad decisions."

These are phrases that can and do appear in a person's medical record. In the public sphere, however, they're easily dismissed as value judgments motivated by politics. The Harvard law professor Noah Feldman recently accused mental-health professionals who attempt to comment on Trump's cognition of "leveraging their professional knowledge and status to 'assess' his mental health for purposes of political criticism."

Indeed, thousands of mental-health professionals have mobilized and signed petitions attesting to Trump's unfitness to hold office. Some believe Trump should carry a label of narcissistic personality disorder, antisocial personality disorder, or both. The largest such petition has more than 68,000 signatures—though there is no vetting of the signatories' credentials. Its author, the psychologist John Gartner, told me last year that in his 35 years of practicing and teaching, "This is absolutely the worst case of malignant narcissism I've ever seen."

Many other mental-health professionals are insistent that Trump not be diagnosed from afar by anyone, ever—that the goal of mental-health care is to help people who are suffering themselves from disabling and debilitating illnesses. A personality disorder is "only a disorder when it causes extreme distress, suffering, and impairment," argues Allen Frances, the Duke University psychiatrist who was a leading author of the third edition of the Diagnostic and Statistical Manual, which was the first to include personality disorders.

This is consistent with the long-standing, widely misunderstood rule in the profession that no one should ever be diagnosed outside the confines of a one-on-one patient-doctor relationship. The mandate stems from the American Psychiatric Association's "Goldwater Rule," implemented after the politician Barry Goldwater sued Fact magazine for libel because a group of mental-health professionals speculated about his thought processes in its pages.

The rule has protected psychiatrists both from lawsuits and from claims of subjectivity that threaten trust in the entire enterprise.

After more than a year of considering Trump's behavior through the lens of the cognitive sciences, I don't think that labeling him with a mental illness from afar is wise. A diagnosis like narcissistic personality disorder is too easily played off as a value judgment by an administration that is pushing the narrative that scientists are enemies of the state. Labeling is also counterproductive to the field in that it presents risks to all the people who deal with the stigma of psychiatric diagnoses. To attribute Trump's behavior to mental illness risks devaluing mental illness.

Judiciousness in public statements is only more necessary as the Trump administration plays up the idea of partisan bias in its campaign against "the media." The consistent message is that if someone is saying something about the president that depicts or reflects upon him unfavorably, the statement must be motivated by an allegiance to a party. It must be, in a word, "fake"—coming from a place of spite, or vengeance, or allegiance to some team, creed, or party. Expertise is simply a guise to further a hidden political cause. Senator Lindsey Graham recently told CNN that the media's portrayal of President Donald Trump is "an endless, endless attempt to label the guy as some kind of kook not fit to be president."

(Of course, Graham himself has called Trump a "kook" who is "not fit to be president." That was in 2016, though, during the Republican presidential primary, when the two were not yet allies.)

That sort of breathless indictment—followed by a reversal and condemnation of others for making the same statement—may not be rare among politicians, but it is a leap to assume that doctors and scientists would similarly lie and abandon their professional ethics out of allegiance to a political party. When judgment is compromised with bias, it tends to be more subtle, often unconscious. Bias will color any assessment to some degree, but it needn't render science useless in assessing presidential capacity.

The idea that the president should not be diagnosed from afar only underscores the point that the president needs to be evaluated up close.

A presidential-fitness committee—of the sort that Carter and others propose, consisting of nonpartisan medical and psychological experts—could exist in a capacity similar to the Congressional Budget Office. It could regularly assess the president's neurologic status and give a battery of cognitive tests to assess judgment, recall, decision-making, attention—the sorts of tests that might help a school system assess whether a child is suited to a particular grade level or classroom—and make the results available.

Such a panel need not have the power to unseat a president, to undo a democratic election, no matter the severity of illness. Even if every member deemed a president so impaired as to be unfit to execute the duties of the office, the role of the committee would end with the issuing of that statement. Acting on that information—or ignoring or disparaging it—would be up to the people and their elected officials.

Of course, the calculations of the Congressional Budget Office can be politicized and ignored—and they recently have been. Almost every Republican legislator voted for health-care bills this year that would have increased the number of uninsured Americans by 20-some million, and they passed a tax bill that will add $1.4 trillion to the federal deficit. A majority of Americans did not support the bill—an informed opposition owed, in part, to the fact that a nonpartisan source of information like the CBO exists to conduct such analyses.

That math and polling can be ignored or disputed, or the CBO can be attacked as a secretly subversive entity, but at least some attempt at a transparent analysis is made. The same cannot be said of the president's cognitive processes. We are left only with the shouts of experts from the sidelines, demeaning the profession and the presidency.

Articles and Stories We Do Not Want to Read or Edit

Posted: 03 Jan 2018 02:14 PM PST

Before memes, before the internet, there were just regular old clichés—text and imagery recycled and adapted across media. All clichés are memes, really, though not all memes are clichés. (Until they are, and then they die, the theory goes.)

Who can remember pre-internet civilization, anyway? It's enough of a stretch to recall what things were like before smartphones anymore, let alone life before dial-up. Revolutions in how information travels are the big ones, upending all kinds of habits and norms, quickly and irreversibly.

"Any revolution in the means of communication is apt to become the cause, if it is not the effect, of a general revolution in technology," the philosopher of history Arnold J. Toynbee wrote in this magazine in 1953. "And a general revolution in technology is bound to bring with it a change in the scale of economic, and therefore of military and political, operations."

Toynbee died in 1975, the year the ARPANET—the technical foundation of the modern-day internet—became operational. He may have thought deeply about what technology did to communications systems and infrastructure in and up to the 20th century, but he never could have anticipated how it would change the way we live and work today. Back in Toynbee's day, for example, one newspaper reporter had described the not-exactly-frenetic pace of American magazine journalism this way: "Deadline pressures have been relative on The Atlantic. Ranking editors are encouraged to wrench themselves away from the pressures of the telephones, and of secretaries with appointment books in hand, in order to travel, go fishing, meditate, or whatever fits the mood."

This is hilarious in 2018 for a few reasons—not least of all because it challenges the idea that only a Millennial would miss work to meditate. But less has changed over the decades than is sometimes assumed. Consider, for example, this jokey New Year's memo from The Atlantic's former editor in chief, Robert Manning, published in this magazine in January 1973. The subject: articles and stories we do not want to read or edit in 1973.


More than four decades later, it's wonderfully (almost eerily) timely—a few very-1970s references to LSD, peyote, and Richard Nixon notwithstanding. Consider, for example, The Atlantic's sense that the following stories had been overdone way back when:

• Next stop Mars
• The insolence of the young
• Does New York City have a future?
• The coming-of-age of television
• The impending demise of college football

Throw in a couple more modern clichés—something about driverless cars and the trolley problem, say—and Manning's list could just as well be used today. (Also, if we're adding to it, please no more Raymond Carver–inspired headlines, and ban all references to Mr. Smith going to Washington unless you are literally writing about the Capra film.) My personal favorite from Manning's list, though, is the relevant-as-ever warning against "reports by journalists too subjective to question, listen to, or observe the people actually involved in the events described." Or, in today's parlance: #NeverTweet.

Those ‘Alien Megastructures’ Are Probably Just Dust

Posted: 03 Jan 2018 09:08 AM PST

In 2015, The Atlantic first reported that astronomers had discovered some tantalizing information about a distant star in the Milky Way, located about 1,300 light-years from Earth in a swan-shaped constellation called Cygnus. The star itself, slightly bigger than our sun, seemed pretty ordinary as far as stars in the universe go. But every now and then, the light of the star appeared to dim and brighten.

This wasn't the weird part. Astronomers look for faint dips in brightness in their search for exoplanets around other stars all the time. The dimming means that something is passing in front of a star and blocking some light from reaching Earth. Telescope observations have discovered thousands of exoplanets in this way.

The weird part about this star was the behavior of those light fluctuations. The flickering seemed almost random. Some dips in light lasted a few hours, while others lasted for days or weeks. The light dimmed by 1 percent at some times, a change that would typically suggest the presence of a Jupiter-sized exoplanet around the star. But at other times, the light would dim by more than 20 percent, a drop that suggested something much more massive was passing by.

The star's sporadic dimming stumped astronomers, who dubbed it "the most mysterious star in the universe." They proposed several natural theories, like a transiting comet. When none seemed to fit the bill, they started considering something else. Could the object passing in front of this star, blocking out the light, be a swarm of alien megastructures built by an advanced civilization?

It was an exciting explanation, one that harkened back to a kind of technology imagined by the science-fiction writer Olaf Stapledon, and later popularized by the physicist Freeman Dyson, that would allow intelligent extraterrestrial life to harness the energy of their star.

It's also probably the wrong one.

The mysterious flickering around the star in the swan constellation is likely caused by ordinary dust, according to an analysis of new data published Wednesday in The Astrophysical Journal Letters. Astronomers have found that the mystery dimming is much deeper at blue wavelengths than at red wavelengths, which means the object responsible for it is not opaque. The data suggest that the dimming is likely caused by a cloud of very small particles of cosmic dust, which measure less than one micron each—smaller than a human red blood cell.

"If you imagine you have some light source and a planet—which is an opaque object—goes in front of it, it will block blue light just as much as it would the red light," said Tabetha Boyajian, an astronomer at Louisiana State University who led the analysis. "What we're seeing for this star is that the drop in the star's brightness is much greater in the blue than it is in the red."

If the dimming had occurred in all colors equally, megastructures would still be on the table. "The fact that the data came in the other way means that we now have no reason to think alien megastructures have anything to do with the dips of Tabby's Star," wrote Jason Wright, the Pennsylvania State University astronomer who first suggested the alien megastructure theory, in a blog post Wednesday. (Astronomers have nicknamed the star Tabby's Star, after Boyajian.)

Searches for radio signals coming from the star have also turned up empty, providing another blow to the alien theory.

The strange flickering from Tabby's Star was first detected by the Kepler Space Telescope, an exoplanet-hunting mission that started looking for changes in the brightness of stars in 2009. The mission produced an enormous amount of data. Boyajian made it public through a program called Planet Hunters and asked volunteers to comb through it, looking for patterns that would be difficult for fast-moving algorithms to spot. In 2011, citizen scientists flagged one star. Of the 150,000 stars Kepler had observed, this was the only one that exhibited this strange behavior.

After the news of Tabby's Star was made public in 2015, other astronomers started digging into its past. They found another kind of dimming phenomenon, one that spanned years. In 2016, astronomers said their examination of old photographic plates from as early as 1890 revealed the star has been gradually decreasing in brightness for more than a century. Later that year, astronomers revisited Kepler data and found that the star actually dimmed slightly every year, by about 0.34 percent. Tabby's Star kept getting weirder and weirder. None of the data fit any one explanation nicely, whether it was a swarm of comets or alien megastructures.

Cosmic dust started looking like one of the best culprits in October 2017. Astronomers analyzed the light from Tabby's Star over time and found more dimming in blue light than in red light—the same effect Boyajian measured during the real-time observations of the dips. Boyajian and other astronomers said Tabby's Star was probably surrounded by clouds of circumstellar dust, grains that orbit around a star and are slightly bigger than interstellar dust. Circumstellar dust is just big enough to stay in orbit around the star, but too small to block light equally in all wavelengths.

The latest analysis comes from data collected between March 2016 and December 2017. Boyajian and her team observed the star with the Las Cumbres Observatory, an effort largely funded by a Kickstarter that raised more than $100,000. In May, the star started to dim. Boyajian tweeted about it, sending astronomers around the world into a frenzy as they raced to point other telescopes at Tabby's star. It was the first time scientists had witnessed one of the star's mysterious dips in real time. A second dip was recorded in June, a third in August, and a fourth in September. The dimming varied between 1 percent and 2.5 percent, and lasted between several days and several weeks.

While the latest observations seem to rule out the possibility of alien megastructures, they don't completely solve the mystery of Tabby's star. Boyajian and her team had expected to detect an excess of infrared light, created when starlight hits surrounding dust, but they saw none. "That was really surprising," Boyajian said. "It's kind of telling us, okay, this is not going to be easy." There are several possible explanations, including that the dust may be too far from the star to become bright enough to glow in infrared.

Boyajian said more papers will be coming in the next few months on the rest of last year's observations.  "There's still a possibility that we don't really have a theory that's correct yet," she said. The cosmic-dust theory provides an answer to one of the questions posed by Tabby's Star, but others remain.

Their answers may not even be something we would recognize if we saw them. As Wright and his colleague Kimberly Cartier wrote in an article last year, "Whatever is responsible may lie outside the realm of known astronomical phenomena."

The Shadow Over ‘Call Me by Your Name’

Posted: 03 Jan 2018 07:21 AM PST

The masterful shot that closes Call Me by Your Name asks the viewer to do the same thing the character on screen is doing: think. Over seven minutes, Elio Perlman, the 17-year-old played by Timothée Chalamet, simply stares into a crackling fireplace as tears well in his eyes. He presumably is reflecting on his tryst with Oliver, Armie Hammer's 24-year-old grad student who visited Elio's Italian home for the summer. And on his own father's life in the closet, revealed to him toward the end of the film. And maybe on his future, perched as he is on the cusp of adulthood, and having just had an affair that felt life-changing.

The audience should be reflecting on those things, too. It's possible, though, they'd be considering something surely not on Elio's mind: AIDS. At least, that was the case for me—a fact that has gotten me into arguments with friends who are, understandably, wary of over-reading a film devoted to young love's bittersweetness and the glory of short shorts.

The acclaim for Luca Guadagnino's adaptation of André Aciman's novel has, overwhelmingly, focused on its cinematic loveliness and emotional power. As Guadagnino's camera inhabits the gaze of a young man whose fantasy becomes reality, it refreshingly depicts "a story of queer love that isn't tinged with horror or tragedy," as my colleague David Sims wrote. The flip side is that Call Me by Your Name's prettiness has come in for rebuke, too, with some critics faulting it for trying too much to appeal to a "universal" audience, and others asking why it has won so much more attention than more provocative, political queer stories.

But I'd argue there actually is a tinge of tragedy to Call Me by Your Name, and part of the richness of the movie is in the way it makes a larger point while mostly keeping politics off screen. The story does feel sealed, its characters happily isolated in a landscape of ripe fruit and ancient ruins that almost feels pre-electricity. Yet on the edges of the film are reminders of the broader social struggle that Elio and Oliver feel temporarily exempted from—and maybe, just maybe, of the epidemic that queer men were beginning to contend with.

Oliver and Elio's archeologist dad read into the surfaces of the artifacts they unearth—"there's not a straight line in any of these statues; they're all curved, as if daring you to desire them," Mr. Perlman says. The viewer should bring the same scrutiny to Guadagnino's surfaces. Why, for example, are there so many flies in the movie? Elio swats bugs away repeatedly, and faint buzzing often joins the idyllic soundscape. Flies are especially noticeable in the scene of Oliver and Elio's first kiss, as well as in the final shot before the fireplace.

The tale unfolds in rural Italian summer, redolent with natural rot—fair enough. But surely there's a reason Guadagnino draws attention to that rot. At Slate, Eleanor Cummins speculates that the insects, which have short lifespans, symbolize the temporary nature of Elio and Oliver's affair. Maybe so. But flies can obviously connote human death and illness, too.

The same can be said for blood, such as the blood that suddenly, inexplicably spills from Elio's nose at dinner. Or such as the blood crusted on a nasty gash on Oliver's hip. When he first shows his wound to Elio, it's a sensual tease—though a gory one. Later, right after their first make-out, Oliver points to the injury again, this time to kill the mood. "I think it's starting to get infected," he says. These touches—pungent, corporal—fit with a story about physical desire. But they also inject a note of queasiness, raising the thought of the body's fragility.

Maybe the horror-film flashes are meant simply to reinforce the fear Oliver and Elio must feel. Their relationship is forbidden, we sense, because of their age difference, because Elio is the son of Oliver's boss, and because they are the same sex. Though none of these factors is spoken of directly, both characters clearly feel a dalliance would be taboo. Elio at one point makes a homophobic crack about his parents' gay friends. And despite his brash, swaggering affect, Oliver comes off as especially worried about the external world's judgment. "We haven't done anything to be ashamed of, and that's a good thing," he tells Elio after breaking off their first kiss. "I want to be good."

The miraculous nature of the story stems not only from Elio and Oliver overcoming their fears, but also from the way the obstacles they face simply vanish—because, we later learn, those obstacles were illusory for them. In the monologue Elio's father gives toward the end of the film, forbidden love is made okay, even encouraged. More than that, Mr. Perlman's confession—that he has wanted but never had the kind of relationship his son has enjoyed—marks the moment when Call Me by Your Name telescopes out. An intimate, specific story must be considered against the larger circumstances that queer people faced. In that context, it becomes a tale, more broadly, of liberation—and perhaps its limits.

When Oliver calls the Perlman household to announce he's engaged to a woman, it reads as a capitulation for the outwardly swashbuckling American who pursued Elio and hid the fact that he had a girl back at home. Outside of the permissive paradise of the Italian summer, we're reminded, there are rules. But Elio may have escaped to a freer future than his lover could access, one less constrained by shame and repression. "You're so lucky," the older man tells the younger one over the phone. "My father would have carted me off to a correctional facility." Even so, Elio is shaken by Oliver's call.

Note the aesthetics of the final scenes. The world is frozen over outside the Perlman house, but inside there is fire and food. The T-shirts Elio wore in summer have been replaced not only by warmer clothes, but also by bolder, even flamboyant, ones. The pattern on his billowy, tucked-in shirt shows a crowd of androgynous faces. As Elio cries by the fire, a fly crawls across those faces.

The shirt's design is so reminiscent of '80s urban life that, whether they're meant to or not, viewers might start to think of the artist Keith Haring, whose work came to be associated with the fight against HIV/AIDS. Or they may simply think of what that decade meant for queer men, both the closeted ones like Oliver and the growing class of liberated ones like Elio. The book version of Call Me by Your Name was set in 1987, but Guadagnino moved the story to 1983 because, he has said, "'83 is the year—in Italy at least—where the '70s are killed, when everything that was great about the '70s is definitely shut down." Part of that shut-down, any cultural history will attest, is that the sexual awakening of the '60s, which fed the libertine '70s, smacked into a hard, deadly reality: AIDS.

I'm not suggesting that the movie telegraphs Elio's future as one of sickness (Guadagnino has talked about filming sequels that follow these characters years later, Before Sunset–style, and the book closes with a series of flash-forwards). The critic Eric Eidelstein persuasively argues that the film's flies and blood could be red herrings, subverting the cliché of the ill-fated gay romance. But the flip side of that subversion is an understanding that prejudice is not the only reason gay people have, so often, been saddled with tragic stories in pop culture. It is an understanding that the year's other splashy European queer film, 120 Beats Per Minute, about AIDS activism in Paris in the early '90s, need not be seen as a foil to Call Me by Your Name but as a companion piece. Self-actualization—or simply loving as one wants—was not the entire struggle.

The queer utopia Elio and Oliver built is poignantly temporary and limited—both for reasons that the movie spells out, and conceivably for historical reasons that go unmentioned but perhaps not unconsidered. In his sermon, Mr. Perlman invites his son to live his truth, but emphasizes that doing so inevitably means opening oneself up to pain. He also makes a statement that's queer in the sense of holding opposed meanings, happy and sad. "When you least expect it," he says, "Nature has cunning ways of finding our weakest spot."

Trump's Bellicosity Is Ceding America's Influence to China

Posted: 03 Jan 2018 11:08 AM PST

Leave to the psychoanalysts the question of why Korea seems to provoke President Trump to more reckless comments than any other international problem. What the world must live with are the consequences.

Again and again since Inauguration Day, Donald Trump has said and tweeted provocative denunciations not only of North Korea, but also of America's supposed ally, South Korea.

In April 2017, on the eve of South Korean presidential elections, the president gave an interview to Reuters that punched two sensitive points. He threatened to rip up the U.S.-Korea Free Trade Agreement. "It is unacceptable, it is a horrible deal made by Hillary," he said. "It's a horrible deal, and we are going to renegotiate that deal or terminate it." In that same interview, Trump demanded a billion-dollar payment for a high-altitude missile defense system. That demand reneged on an agreement reached by Trump's own administration, by which the South Koreans provided the land for the system and the United States provided the weapons. It probably will not surprise you to learn that the free-trade agreement was not, in fact, negotiated by Hillary Clinton. Most of the work was done under President George W. Bush. The agreement then stalled in Congress after the Democratic victories of 2006, until President Obama's trade negotiators revised it to provide more advantages for U.S. automakers. Accurate or not, Trump's comments sent South Korean stock and currency markets into a tumble.

Trump's pique with South Korea might be explained by an embarrassment he had suffered in the country two weeks earlier. Apparently misunderstanding a Pentagon briefing, Trump had boasted in an April 12 Fox Business interview that he was personally and immediately sending a "very powerful" "armada" into Korean waters to menace North Korea. That armada—the aircraft carrier USS Carl Vinson and support vessels—was then photographed thousands of miles away heading in the opposite direction, passing between the Indonesian islands of Java and Sumatra en route toward India. Trump's mistake was criticized by South Korean politicians and mocked in the South Korean media. The Reuters interview may have been payback.

That interview had the unintended effect of helping to boost the more U.S.-skeptical of the South Korean presidential candidates in the May 9 election. In midsummer, speaking at his New Jersey golf retreat without a single South Korean present, Trump promised to visit "fire and fury like the world has never seen" upon North Korea. In September at the United Nations he warned that he might "totally destroy North Korea," adding "Rocket Man is on a suicide mission for himself and his region."

As a candidate for president, South Korea's Moon Jae-in had opposed the deployment of missile defenses, urging negotiation with the North instead. Now as president, this conciliation-minded leader—already inclined toward skepticism of the United States—daily confronts a new strategic reality: His country's most important security partner seems determined to confirm every negative attitude about the U.S. held by nationalist South Koreans. The Moon government has responded with a flurry of overtures toward the North.

Together, Kim Jong Un and Donald Trump are enabling the North Korean nuclear program to evolve into a mighty diplomatic weapon against U.S. interests, separating South Korea from the United States, incentivizing the South to placate the North. Together, Kim and Trump are depriving the U.S. of conventional military options—because there is no non-nuclear option against the North without the support of the South. Between 2015 and 2017, South Korean confidence in the United States to do the right thing in international affairs dropped by a startling 71 points in a Pew survey. Only 17 percent of South Koreans have confidence in Donald Trump—less than half the number that trust China's Xi Jinping.

And who is Xi's best publicist? Why, Donald Trump himself. Trump has often told the world that it is China, not the United States, that has the most leverage over North Korea. He tweeted in 2013, "North Korea is reliant on China. China could solve this problem easily if they wanted to but they have no respect for our leaders." And as president too, he has looked to China first and foremost to sway North Korea. He tweeted in July 2017: "Perhaps China will put a heavy move on North Korea and end this nonsense once and for all!"

The thought is bound to occur to South Koreans increasingly wary of Trump's protectionism, unpredictability, and bellicosity: If indeed it is China that can control the North, maybe it is to China, not the United States, that South Koreans should look for security?

In a May 30 op-ed, White House senior advisers Gary Cohn and H.R. McMaster sought to assure the world that "America First" does not mean "America alone." In the Korean peninsula, however, increasingly that's just what Trump has wrought. Trump's warlike boasting is steadily leading the United States toward the starkest and most extreme dilemma: The only policies remaining will be a unilateral nuclear strike upon the North—or humbly submitting to a new Chinese-led security order in Northeast Asia.  

The Terrifying Truth of Trump's 'Nuclear Button' Tweet

Posted: 03 Jan 2018 08:24 AM PST

When the American president tweeted on Tuesday evening that his "Nuclear Button" is "bigger & more powerful" than the North Korean leader's, and that "my Button works!" unlike the desktop button that Kim Jong Un had just threatened the United States with in a New Year's speech, Twitter naturally exploded with angst.

People wondered whether they were hallucinating, whether their final moments would involve "reading Twitter hot takes as nukes rain down." "Folks ... are freaking out about the mental instability of a man who can kill millions without permission from anybody," one former Obama administration official wrote. Setting aside the technicalities of Donald Trump's boast (he has a briefcase, not a button), the commander in chief was casually sounding off on social media about war with the world's deadliest weapons, apparently after watching Fox News. He was daring Kim to prove that his "nuclear button" works by, for example, testing a missile with a live nuclear weapon over the Pacific Ocean—the kind of scenario that the Republican Senator and Trump confidant Lindsey Graham recently told me would dramatically increase the chances of a U.S. attack on North Korea.

But lurking behind the freakout was a profoundly uncomfortable fact: Trump was stating, in the crudest possible form, what U.S. officials have said for decades. Kim Jong Un had argued that his capability to hit the United States with nuclear weapons would dissuade the U.S. from waging war against North Korea. And Donald Trump seemed to be reminding Kim that he'd best not consider a nuclear strike—since America's nuclear-weapons arsenal is superior to North Korea's and America isn't afraid to use it. This was nuclear deterrence, in 280 Trumpian characters. Stripped of the usual abstraction and euphemism, it was terrifying to behold.

In 1958, the U.S. military strategist Bernard Brodie didn't taunt the rising nuclear power at the time, the Soviet Union, by tweeting "my Button works!" But he did write that deterrence in the Atomic Age operated on a "sliding scale" in which any functional nuclear weapon provided considerable deterrence and the "maximum possible deterrence" required "'decisive superiority' over the enemy." When the Cold War ended, a Defense Department committee didn't recommend that America's deterrence policy be "I too have a Nuclear Button." But it did declare that the "essential sense of fear is the working force of deterrence" and that the United States should convey to adversaries in ambiguous terms that it "may become irrational and vindictive if its vital interests are attacked." It praised Bill Clinton for informing the North Koreans that if they ever used nuclear weapons, "it would be the end of their country." Be it John F. Kennedy or Ronald Reagan or Barack Obama, American presidents have spoken passionately not about how "big & powerful" U.S. nuclear weapons are, but about the need, as Kennedy put it, to abolish these weapons of war "before they abolish us." But every American president since Harry Truman, while engaging in efforts to restrict and scale back nuclear proliferation, "has sought to maintain, in the words of John F. Kennedy, a nuclear-weapons capability 'second to none,'" as the former arms-control official Robert Joseph has noted. There's a long, bipartisan tradition of "My nuclear button is bigger and better than yours"—or, at least, as big and as good as yours.

"Any threat to the United States, or its territories … or our allies will be met with a massive military response—a response both effective and overwhelming. … We are not looking to the total annihilation of a country—namely, North Korea. But, as I said, we have many options to do so," Defense Secretary James Mattis tells us, not on Twitter but before cameras at the White House. "There is no substitute for the prospect of a devastating nuclear response" from America's nuclear-capable bombers, intercontinental ballistic missiles, and submarine-launched missiles to deter a nuclear first strike by U.S. adversaries, says Air Force General Paul Selva. This nuclear triad, he contends, must be kept up-to-date since "deterrence really is no different in the 21st century than it was in the 20th century or the 19th century or the 1st century B.C.—'You hurt me, I'm going to hurt you worse. I have the tools to do it, and if you don't believe me, then step over the line.'" The Pentagon recently created an entire microsite devoted to these themes. In one video on the site, John Hyten, the top U.S. nuclear commander, asserts that "strategic" deterrence "starts with the nuclear capabilities" and that "our nuclear forces have to be ready all the time to provide that initial deterrent capability," as forbidding images of missiles, submarines, aircraft, and launch switches flash across the screen. If Trump had tweeted "I too have a Strategic Deterrent and our Nuclear forces are always ready to provide that initial Deterrent capability!"—if he had cloaked his warning in conventional presidentialese—he might have sowed less panic on Twitter. But the substance of his message wouldn't have changed much.

Even Trump's reference to the mythical nuclear button—to the U.S. president's largely untrammeled authority to order the use of nuclear weapons—has roots in deterrence theory. As the historian Alex Wellerstein has written, "While nuclear launch officers are not meant to be strictly mechanical (and indeed, the United States has always resisted fully automating the process), if they stopped to question whether their authenticated orders were legitimate, they would put the credibility of U.S. nuclear deterrence at risk." Trump's singularly coarse, aggressive, and unpredictable approach to conducting foreign policy—and particularly to countering North Korea's nuclear program—has moved many Americans to think about the unthinkable and look U.S. nuclear policy squarely in the face. This fall, for instance, a congressional foreign-affairs committee held the first hearing in 41 years on presidential power over nuclear weapons. "Donald Trump can launch nuclear codes just as easily as he can use his Twitter account," observed Democrat Ed Markey, who has introduced legislation to prevent the president from authorizing the first nuclear strike in a conflict without a congressional declaration of war. "No one human being should ever have that power."

Beatrice Fihn, a campaigner to eliminate the world's nuclear weapons, said something similar shortly before accepting last year's Nobel Peace Prize. "If you're uncomfortable with nuclear weapons under Donald Trump, you're probably uncomfortable with nuclear weapons, because it means you recognize that [deterrence] won't always hold up and things can go wrong," she told me. "Once you start thinking 'this person is appropriate for this weapon but not that person,' then maybe it's the weapon that's the problem."

How to Take a Picture of a Stealth Bomber Over the Rose Bowl

Posted: 03 Jan 2018 05:00 AM PST

The first thought that comes to mind staring at the photograph above is: This has got to be fake. The B-2 stealth bomber looks practically pasted onto the field. The flag is unfurled just so. The angle feels almost impossible, shot directly down from above.

And yet, it's real, the product of lots of planning, some tricky flying, and the luck of the moment. The photographer, Mark Holtzman, has been flying his Cessna 206 around taking aerial images for years, since before the digital-photography days, and he's developed his technique for just this sort of shot.

"The plane is my tripod, and it is a moving tripod," he told me. In fact, the way he took this photograph was literally half-hanging out the window of his plane, his Canon 5D Mark III fitted with a 70–200 mm lens, working the rudder pedals on his craft to put himself in position to fly right over the bomber, as it approached at 200 miles per hour from the opposite direction.

As a dedicated amateur photographer, I spoke with him about the sheer improbability of this photograph, the nerdy technical details, and how you get the authorities to let you fly your Cessna over a B-2.


Alexis Madrigal: First, how do you get cleared to be in that airspace? Is it restricted?

Mark Holtzman: Most of the time, they have a TFR, temporary flight restrictions. Above that, I can fly. But I'm always talking with them. It's run under the Pasadena Police, so I get a clearance. They don't want anybody just flying around during a big event like that, even though you theoretically can. So I was on a discreet frequency, the same frequency as the B-2, talking to them. They know me now.

Madrigal: How high are the military jets flying?

Holtzman: Minimum altitude anybody can be is about 1,000 feet. So, they are roughly 1,000 feet above the people. I was about 2,500 feet above them.

Madrigal: Were you using a pretty huge lens?

Holtzman: Well, the lens I had out was a 70–200 mm lens, but I was really at the 70 mark on it because my goal was to catch the whole stadium.
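A quick back-of-the-envelope check on those numbers (the focal length and altitudes come from the interview; the full-frame 36 mm sensor width of the Canon 5D Mark III and the pinhole-camera simplification are my assumptions): from roughly 3,500 feet above the stadium (2,500 feet above jets flying at about 1,000 feet), a 70 mm lens does indeed take in far more than the stadium itself.

```python
# Approximate ground coverage of the frame using a pinhole-camera model:
# footprint = altitude * (sensor width / focal length).
# Altitudes and focal length are from the interview; the 36 mm sensor
# width (full-frame Canon 5D Mark III) is an assumption on my part.

def ground_footprint_ft(altitude_ft: float, focal_mm: float,
                        sensor_mm: float = 36.0) -> float:
    """Approximate width (feet) of ground covered by the frame."""
    return altitude_ft * sensor_mm / focal_mm

print(round(ground_footprint_ft(3500, 70)))  # ~1800 ft across
```

A frame roughly a third of a mile wide comfortably covers the whole stadium plus some parking lot, which matches Holtzman's description of the uncropped shot.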

Madrigal: So that's the picture as you took it right out of the camera, or did you have to crop it?

Holtzman: I always crop it a little. I had to rotate it a little. In the uncropped version, I had the whole stadium, plus some of the parking lot. Unlike film, the way you shoot digital is you shoot wider and crop it in. It's hard. Things are happening really quick. It's very fluid. I'm flying at 100 miles per hour. They are flying 200 miles an hour in the other [direction]. So, that's 300 miles per hour. Things happen really quickly.

The uncropped image of the B-2 flyover (Mark Holtzman)

Madrigal: How fast are you shooting? What's the shutter speed? (A typical indoor iPhone photo might be exposed for one-tenth to one-30th of a second.)

Holtzman: I'm always over 1,000 [that's one one-thousandth of a second—or very fast]. It's always safer to be there when you're flying.
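Those two figures together show why the shot is possible at all (a sketch using the speeds and shutter time from the interview; the unit conversion is mine): at a 300 mph closing speed, a 1/1000-second exposure limits the bomber's motion relative to the camera to a few inches.

```python
# How far does the B-2 move relative to the camera during one exposure?
# Closing speed (~300 mph) and shutter time (1/1000 s) are from the
# interview; this just converts units.

MPH_TO_FPS = 5280 / 3600  # miles per hour -> feet per second

def relative_motion_ft(closing_speed_mph: float, shutter_s: float) -> float:
    """Distance (feet) the subject travels relative to the camera
    while the shutter is open."""
    return closing_speed_mph * MPH_TO_FPS * shutter_s

print(f"{relative_motion_ft(300, 1 / 1000):.2f} ft")  # 0.44 ft, ~5 inches
```

Five inches of relative motion is small against an aircraft with a 172-foot wingspan, which is why the plane reads as tack-sharp despite the combined speeds involved.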

Madrigal: Can you give me a little more on the logistics of catching the moment?

Holtzman: First you have to figure out what you want to show. For me, my goal was to put the B-2 inside the stadium, preferably in the grass. And I don't want to block any of the names or other stuff. For this picture, if you block the flag, it takes away from it.

So, first you're trying to find the B-2 as it is flying toward you. Everything is fluid. I am moving around. They have to be on their target and you have to be on yours. There are no shortcuts. Sometimes it works and sometimes it doesn't.

Madrigal: One thing that makes this image so spectacular is that it feels like you shot it straight, straight down onto the bomber. It's such an unusual view.

Holtzman: It was pretty much angled straight down. But to be honest, I was just going for the picture. I had my son with me in the backseat so I could make sure that one of us saw the plane coming in. Because things are happening and sometimes you don't focus on them quick enough. But once I'm on shooting, I'm kicking the rudders around to try to put the plane wherever. My feet are always on the rudders and I'm always moving.

Madrigal: It's like you said, your plane is your tripod.

Holtzman: I'm moving the plane around and sticking my head out the window. And then I'm moving the plane, kicking it around left and right to get what I want. It's not like I'm in a blimp right above it waiting for it to fly.

Madrigal: When I first saw it, I thought maybe you'd mounted the camera to the bottom of the plane.

Holtzman: I do have a camera hole in the bottom, but I almost never use it. I need to keep a literal eye out for this thing and then watch it through my lens, then kick the plane any way that I can. It's a handful. It's the challenge. Where he ended up, I started rolling it over to get [the plane] inside [the stadium], so I could get a vertical picture. I hate to say the word, but I am totally oblivious to where my plane is aiming.

Madrigal: Are you literally hanging out the window?

Holtzman: I try to stay inside because I don't need the wind buffeting me. But I am oblivious to everything else except getting that picture. I get into a zone.

Madrigal: That's when photography becomes more of an athletic pursuit.

Holtzman: It's like the photographers in combat when they are overseas. They can get themselves in harm's way because they are looking through a camera. You do get that way up there. But there is a lot of thought going into it before and we've been pretty successful. When I saw the flag go open, my eyes lit up. I knew I had a nice picture, if I could get this guy in the frame. But it's luck. It's just timing.

Madrigal: What do you think it costs you to do one of these flights?

Holtzman: Just the gas alone is $100 per hour.

Madrigal: What's your background? Military aviation or just learned to fly? And how'd the photography get in there?

Holtzman: My background, I grew up taking pictures. My dad took pictures. We had a darkroom. Then I got my pilot's license at 17. In Burbank [California]. My background's in music. That's what I've always done. Then I worked in a family business, as a lifeguard, as a paramedic. I wasn't going anywhere with the family business, so one day I just decided to become an aerial photographer. I'd never met an aerial photographer. I didn't know they existed.

Madrigal: This was a while ago?

Holtzman: This was back in the days before digital. There I was taking a camera up. No GPS, using a Thomas Guide to find where I was going. You find the site, hopefully, sometimes you didn't. Then you'd bring your pictures in and get them developed and then a day later, you see what you got. In retrospect, I can't believe I did that. Then, when digital came along and GPS came along, it was instant gratification and you could find things easily.

Madrigal: Have you dabbled in drones?

Holtzman: No. It's just not what I do. I don't want to be a wedding photographer either. I'm not against it. The low-altitude videos are wonderful. Unfortunately, in California, in Los Angeles, everyone thinks they are a cinematographer, so a lot of people are flying around and a lot of it is illegal and can be dangerous. My son calls them just a different tool, but for this kind of stuff you couldn't use them to do it.

Why Do We Need to Sleep?

Posted: 03 Jan 2018 01:23 PM PST

TSUKUBA, Japan—Outside the International Institute for Integrative Sleep Medicine, the heavy fragrance of sweet Osmanthus trees fills the air, and big golden spiders string their webs among the bushes. Two men in hard hats next to the main doors mutter quietly as they measure a space and apply adhesive to the slate-colored wall. The building is so new that they are still putting up the signs.

The institute is five years old, its building still younger, but already it has attracted some 120 researchers from fields as diverse as pulmonology and chemistry and countries ranging from Switzerland to China. An hour north of Tokyo at the University of Tsukuba, with funding from the Japanese government and other sources, the institute's director, Masashi Yanagisawa, has created a place to study the basic biology of sleep, rather than, as is more common, the causes and treatment of sleep problems in people. Full of rooms of gleaming equipment, quiet chambers where mice slumber, and a series of airy work spaces united by a spiraling staircase, it's a place where tremendous resources are focused on the question of why, exactly, living things sleep.

Ask researchers this question, and listen as, like clockwork, a sense of awe and frustration creeps into their voices. In a way, it's startling how universal sleep is: In the midst of the hurried scramble for survival, across eons of bloodshed and death and flight, uncountable millions of living things have laid themselves down for a nice, long bout of unconsciousness. This hardly seems conducive to living to fight another day. "It's crazy, but there you are," says Tarja Porkka-Heiskanen of the University of Helsinki, a leading sleep biologist. That such a risky habit is so common, and so persistent, suggests that whatever is happening is of the utmost importance. Whatever sleep gives to the sleeper is worth tempting death over and over again, for a lifetime.

The precise benefits of sleep are still mysterious, and for many biologists, the unknowns are transfixing. One rainy evening in Tsukuba, a group of institute scientists gathered at an izakaya bar manage to hold off only half an hour before sleep is once again the focus of their conversation. Even simple jellyfish have to rest longer after being forced to stay up, one researcher marvels, referring to a new paper where the little creatures were nudged repeatedly with jets of water to keep them from drifting off. And the work on pigeons—have you read the work on pigeons? another asks. There is something fascinating going on there, the researchers agree. On the table, dishes of vegetable and seafood tempura sit cooling, forgotten in the face of these enigmas.

In particular, this need to make up lost sleep, which has been seen not just in jellyfish and humans but all across the animal kingdom, is one of the handholds researchers are using to try to get a grip on the bigger problem of sleep. Why we feel the need for sleep is seen by many as key to understanding what it gives us.

Biologists call this need "sleep pressure": Stay up too late, build up sleep pressure. Feeling drowsy in the evenings? Of course you are—by being awake all day, you've been generating sleep pressure! But like "dark matter," this is a name for something whose nature we do not yet understand. The more time you spend thinking about sleep pressure, the more it seems like a riddle game out of Tolkien: What builds up over the course of wakefulness, and disperses during sleep? Is it a timer? A molecule that accrues every day and needs to be flushed away? What is this metaphorical tally of hours, locked in some chamber of the brain, waiting to be wiped clean every night?

In other words, asks Yanagisawa, as he reflects in his spare, sunlit office at the institute, "What is the physical substrate of sleepiness?"

Biological research into sleep pressure began more than a century ago. In some of the most famous experiments, a French scientist kept dogs awake for more than 10 days. Then, he siphoned fluid from the animals' brains, and injected it into the brains of normal, well-rested canines, which promptly fell asleep. There was something in the fluid, accumulating during sleep deprivation, that made the dogs go under. The hunt was on for this ingredient—Morpheus's little helper, the finger on the light switch. Surely, the identity of this hypnotoxin, as the French researcher called it, would reveal why animals grow drowsy.

In the first half of the 20th century, other researchers began to tape electrodes to the scalps of human subjects, trying to peer within the skull at the sleeping brain. Using electroencephalographs, or EEGs, they discovered that, far from being turned off, the brain has a clear routine during the night's sleep. As the eyes close and breathing deepens, the tense, furious scribble of the waking mind on the EEG shifts, morphing into the curiously long, loping waves of early sleep. About 35 to 40 minutes in, the metabolism has slowed, the breathing is even, and the sleeper is no longer easy to wake. Then, after a certain amount of time has passed, the brain seems to flip a switch and the waves grow small and tight again: This is rapid eye movement, or REM, sleep, when we dream. (One of the first researchers to study REM found that by watching the movements of the eyes beneath the lids, he could predict when infants would wake—a party trick that fascinated their mothers.) Humans repeat this cycle over and over, finally waking at the end of a bout of REM, minds full of fish with wings and songs whose tunes they can't remember.

Sleep pressure changes these brain waves. The more sleep-deprived the subject, the bigger the waves during slow-wave sleep, before REM. This phenomenon has been observed in about as many creatures as have been fitted with electrodes and kept awake past their bedtimes, including birds, seals, cats, hamsters, and dolphins.

Natalie Andrewson

If you needed more proof that sleep, with its peculiar many-staged structure and tendency to fill your mind with nonsense, isn't some passive, energy-saving state, consider that golden hamsters have been observed waking up from bouts of hibernation—in order to nap. Whatever they're getting from sleep, it's not available to them while they're hibernating. Even though they have slowed down nearly every process in their body, sleep pressure still builds up. "What I want to know is, what about this brain activity is so important?" says Kasper Vogt, one of the researchers gathered at the new institute at Tsukuba. He gestures at his screen, showing data on the firing of neurons in sleeping mice. "What is so important that you risk being eaten, not eating yourself, procreation ... you give all that up, for this?"

The search for the hypnotoxin was not unsuccessful. There are a handful of substances clearly demonstrated to cause sleep—including a molecule called adenosine, which appears to build up in certain parts of the brains of waking rats, then drain away during slumber. Adenosine is particularly interesting because it is adenosine receptors that caffeine seems to work on. When caffeine binds to them, adenosine can't, which contributes to coffee's anti-drowsiness powers. But work on hypnotoxins has not fully explained how the body keeps track of sleep pressure.

For instance, if adenosine puts us under at the moment of transition from wakefulness to sleep, where does it come from? "Nobody knows," remarks Michael Lazarus, a researcher at the institute who studies adenosine. Some people say it's coming from neurons, some say it's another class of brain cells. But there isn't a consensus. At any rate, "this isn't about storage," says Yanagisawa. In other words, these substances themselves don't seem to store information about sleep pressure. They are just a response to it.

Sleep-inducing substances may come from the process of making new connections between neurons. Chiara Cirelli and Giulio Tononi, sleep researchers at the University of Wisconsin, suggest that since making these connections, or synapses, is what our brains do when we are awake, maybe what they do during sleep is scale back the unimportant ones, removing the memories or images that don't fit with the others, or don't need to be used to make sense of the world. "Sleep is a way of getting rid of the memories in a way that is good for the brain," Tononi speculates. Another group has discovered a protein that enters little-used synapses to cause their destruction, and one of the times it can do this is when adenosine levels are high. Maybe sleep is when this cleanup happens.

There are still many unknowns about how this would work, and researchers are working many other angles in the quest to get to the bottom of sleep pressure and sleep. One group at the Tsukuba institute, led by Yu Hayashi, is destroying a select group of brain cells in mice, a procedure that can have surprising effects. Depriving mice specifically of REM sleep by shaking them awake repeatedly just as they're about to enter it (a bit like what happens to the parents of crying babies) causes serious REM sleep pressure, which mice have to make up for in their next bout of slumber. But without this specific set of cells, mice can miss REM sleep without needing to sleep more later. Whether the mice get away totally unscathed is another question—the team is testing how REM sleep affects their performance on cognitive tests—but this experiment suggests that where dreaming sleep is concerned, these cells, or some circuit they are part of, may keep the records of sleep pressure.

Yanagisawa himself has always had a taste for epic projects, like screening thousands of proteins and cellular receptors to see what they do. In fact, one such project brought him into sleep science about 20 years ago. He and his collaborators, after discovering a neurotransmitter they named orexin, realized that the reason the mice without it kept collapsing all the time was that they were falling asleep. That neurotransmitter turned out to be missing in people with narcolepsy, who are incapable of making it, an insight that helped trigger an explosion of research into the condition's underpinnings. In fact, a group of chemists at the institute at Tsukuba is collaborating with a drug company in an investigation of the potential of orexin mimics for treatment.

These days, Yanagisawa and collaborators are working on a vast screening project aimed at identifying the genes related to sleep. Each mouse in the project, exposed to a substance that causes mutations and fitted with its own EEG sensors, curls up in a nest of wood chips and gives in to sleep pressure while machines record its brain waves. More than 8,000 mice so far have slumbered under observation.

When a mouse sleeps oddly—when it wakes up a lot, or sleeps too long—the researchers dig into its genome. If there is a mutation that might be the cause, they try to engineer mice that carry it, and then study why the mutation disrupts sleep. Many very accomplished researchers have been doing this for years in organisms like fruit flies, making great progress. But the benefit to doing it in mice, which are extremely expensive to maintain compared to flies, is that they can be hooked up to an EEG, just like a person.

A few years ago, the group discovered a mouse that just could not seem to get rid of its sleep pressure. Its EEGs suggested it lived a life of snoozy exhaustion, and mice that had been engineered to carry its mutation showed the same symptoms. "This mutant has more high-amplitude sleep waves than normal. It's always sleep-deprived," says Yanagisawa. The mutation was in a gene called SIK3. The longer the mutants stay awake, the more chemical tags the SIK3 protein accumulates. The researchers published their discovery of the SIK3 mutants, as well as another sleep mutant, in Nature in 2016.

While it isn't exactly clear yet how SIK3 relates to sleepiness, the fact that tags build up on the enzyme, like grains of sand pouring to the bottom of an hourglass, has the researchers excited. "We are convinced, for ourselves, that SIK3 is one of the central players," says Yanagisawa.

As researchers probe outward into the mysterious darkness of sleepiness, these discoveries shine ahead of them like flashlight beams, lighting the way. How they all connect, how they may come together into a bigger picture, is still unclear.

The researchers hold out hope that clarity will come, maybe not next year or the next, but sometime, sooner than you might think. On an upper story at the International Institute for Integrative Sleep, mice go about their business, waking and dreaming, in row after row of plastic bins. In their brains, as in all of ours, is locked a secret.