- How Tutus Took Over Runners’ Wardrobes
- Learning from 1968's Leading Anti-Immigration Alarmist
- Is the Senate Bill to Protect Mueller Constitutional?
- A Trade War Isn't a Real War
- <em>Westworld</em>: 'Everything Is Code'
- The Desperate Search for Lebanon's Mass Graves
- How North Korea Learned to Live With 'Fire and Fury'
- Photos From State Dinners Past
- The Promethean Puzzles of <i>Westworld</i>
- Trump and Elite Schools: A Harvard Athlete Weighs In
- Cupholders Are Everywhere
- When Beauty Is a Troll
Posted: 23 Apr 2018 04:00 AM PDT
By mile nine, Kelly Lewis and her friends knew they were on to something. She and her pals Elise Wallace and Carrie Lundell had donned sparkly skirts that Lundell, a seamstress, had whipped up as a way to stand out while they ran the 2010 Surf City USA Marathon. At the time, wearing something so outlandish on a non-costume run was such an anomaly that Wallace was reluctant to join in. "She said, 'No way, I'm not going to do it,'" says Lewis, although they ultimately convinced her.
As they ran, runners and spectators kept complimenting their skirts and asking where they got them, Lewis says. "We were like, 'We should try to sell these. Maybe somebody else would want to wear them.'" Later that week Lewis created a website for Sparkle Athletic. Within a few months, demand for glittery running skirts was high enough that the team hired a manufacturer, and the line eventually expanded into accessories like rainbow socks, sequined visors, and tank tops emblazoned with phrases like "I don't sweat, I sparkle."
I've run a handful of races per year since my first 5K in 2008, and have done enough theme/costume runs to be used to seeing women (and occasionally men) in fluffy statement skirts—sometimes stiff ballet-inspired tutus, sometimes just sparkly costume skirts (utilitarian running skirts are a different sartorial category). I stopped thinking of running tutus as a novelty, however, when I saw them being sold as official merchandise at the 2013 Color Run in Chicago.
At this point, the tutu has transcended costume and become a common piece of many runners' race-day wardrobes, regardless of whether it's a costumed event. How and when did that happen?
An obvious place to start, as with most things princessy, is Disney. The company runDisney has been hosting races across Disney theme parks and resorts since the mid-'90s, but in the late '00s it began rolling out women-focused races like the Disney Minnie Marathon Weekend and the Disney Princess Half Marathon Weekend, which debuted in 2009. However, it took a little while for the costume skirts to catch on. For the 2010 Disneyland half marathon, Lewis decided to dress like Tinker Bell. She lined up in the first corral, the sectioned area at the front of the starting line reserved for elite runners. "I walked in wearing a lime-green sparkly skirt and wings and everyone in that corral turned around and gave me the up and down, like, 'Why is she in this corral?'" says Lewis. (She set a personal record during that race.)
Now? You can find chat rooms dedicated to the art and science of Disney race tutus. According to the Orlando dietitian Tara Collingwood, a runner who once served as the official nutritionist for runDisney, "At the Disney races, if you're not wearing some sort of sparkle skirt, you're kind of sticking out."
Carey Pinkowski has directed the Chicago Marathon since the early 1990s and says that he began noticing a rise in outlandish race gear over the last 10 to 12 years. "It wasn't until the charities really came to prominence and they identified themselves [with uniforms] to draw attention to what they were doing," he says. In 2011, the Wall Street Journal noted that the rise of the charitable run had arrived hand in hand with technology that made it easier for runners to raise money online. Races are not only fundraising events, but ways for charitable groups like Girls on the Run (GOTR) or the Leukemia and Lymphoma Society to advertise themselves. Pinkowski says, "They have their own uniforms: They're very identifiable, and very much keen on the branding."
Ben Waldman, 37, a volunteer running coach in New York, donned a skirt for charity when he ran the 2013 Nike Women's DC Half Marathon. While men weren't forbidden from the course, he had reservations about being a man in a women's race, even though he was running with women he had trained for the race. Donning his team's tutu, Waldman says, "took away from me being a guy on the course." While he found the skirt less than comfortable, he couldn't deny that it helped increase team camaraderie and made the race more fun and appealing to would-be runners. "A half marathon, which seems so unreachable to so many people, suddenly is more in reach when you see people having a good time and being silly."
It's hard to believe that until 1971, women weren't officially permitted to run the Boston Marathon. Now, groups like Black Girls RUN! and Girls on the Run help contribute to the rise of (and business of) women runners—Pinkowski says that the Shamrock Shuffle, another major Chicago race he oversees, was 58 percent women this year.
Tara Baize, a buyer for a medical-device company, first met Monika Allen when they served on the board of the San Diego branch of the nonprofit Girls on the Run. They decided to make and sell tutus (the stiffer ballerina style) to raise money for GOTR. Calling their tutu company Glam Runner, Allen and Baize made the tutus by hand (after first watching an instructional YouTube video) and sold them via PayPal. "It took us two years even to get 1,000 likes on Facebook," says Baize, 40.
Then, in 2014, SELF magazine posted a photo of Allen and Baize crossing the finish line of the Los Angeles Marathon dressed as superheroes in their signature tutus with the caption "People think these froufrou skirts make you run faster. Now, if you told us they made people run away from you faster, maybe we would believe it." The magazine took down the post and issued a mortified apology when readers pointed out that not only were Allen and Baize running in tutus to raise money for GOTR, but Allen was undergoing treatment for brain cancer (Baize's marathon bib read "Die Tumor Die!!").
The inevitable SELF backlash resulted in overwhelming attention for Glam Runner. "We went to 30,000 likes," says Baize. "Then we really started to see tutus everywhere." Business was so busy that she and Allen temporarily had to shut down the Etsy store. Then, in January 2017, Allen succumbed to cancer, which revived the SELF story.
The story of Allen and SELF may have helped imbue the running tutu with a certain glow of rebellion and survival in addition to its aesthetic appeal. "I'm always out there, running my heart out," says Collingwood, 43, the mother of four boys. Wearing a tutu, she says, lets a runner signal "'Hey I'm tough and I can withstand this difficult endeavor that I'm about to do, but I'm still a girl at the end of the day.'"
Lewis says she's proud that Sparkle Athletic skirts are worn by Ironman finishers and Boston Marathon qualifiers as well as by newbies. "We've gotten girls who had written in who are 3:11 marathoners: They're super fast and they're serious," she says. "They write and tell us, 'It's really fun to pass people. But it's so much more fun to pass people in a sparkly skirt.'"
The popularity of the tutu can be traced to its unlikely flexibility as a statement—it can mean you love Minnie Mouse, or that you are part of a fundraising team that has qualified for the Boston Marathon, or that you actually hate running but this is the only way you're going to have fun doing it. A running tutu can be both a sincere and an ironic piece of femininity in a sport where being a woman, even as a casual athlete, can be a deadly liability. The average woman may prefer to be anonymous during her regular runs, free from unwanted attention, but pulling on a tutu for race day can seem like a sly middle finger slung around a runner's hips—Look at me go.
Posted: 23 Apr 2018 03:00 AM PDT
Editor’s Note: This is part of The Atlantic's ongoing series looking back at 1968. All past articles and reader correspondence are collected here. New material will be added to that page through the end of 2018.
Fifty years ago, the Conservative Member of Parliament Enoch Powell delivered what may be the most controversial speech in postwar British history: an attack on mass immigration comparing growth in that country's minority population to "watching a nation busily engaged in heaping up its own funeral pyre."
Already, Powell argued, immigrants had rendered his nation's existing population "strangers in their own country." Suddenly, "they found their wives unable to obtain hospital beds in childbirth, their children unable to obtain school places, their homes and neighbourhoods changed beyond recognition, their plans and prospects for the future defeated; at work they found that employers hesitated to apply to the immigrant worker the standards of discipline and competence required of the native-born worker; they began to hear, as time went by, more and more voices which told them that they were now the unwanted."
He sympathetically quoted one of his constituents, who thought that "in this country in 15 or 20 years' time the black man will have the whip hand over the white man."
And he argued that "the sense of being a persecuted minority which is growing among ordinary English people in the areas of the country which are affected is something that those without direct experience can hardly imagine." By way of example, he chose "just one of those hundreds of people" to illustrate his point. She had lived on a "respectable street in Wolverhampton" where, eight years prior, a black person had bought a house. Now she was the only white person left:
In his estimation, those who believed that integration would allow people like the pensioner and her neighbors to live together in harmony were dangerously deluded:
Thus his dramatic proposals: an end to almost all immigration; and financial incentives to encourage some percentage of immigrants to voluntarily return to their countries of origin, so as to tip the demographic trajectory decisively toward whites. He went on to say, "as I look ahead, I am filled with foreboding; like the Roman, I seem to see 'the River Tiber foaming with much blood,'" a line that explains why his words have since been referred to as the "Rivers of Blood" speech.
He concluded by invoking the U.S., where the assassination of Martin Luther King, Jr. had just sparked riots in dozens of cities. "That tragic and intractable phenomenon which we watch with horror on the other side of the Atlantic but which there is interwoven with the history and existence of the States itself, is coming upon us here by our own volition and neglect," he said. "Indeed, it has all but come. In numerical terms, it will be of American proportions long before the end of the century. Only resolute and urgent action will avert it even now."
Though little-known in the U.S., the "Rivers of Blood" speech remains massively controversial in the U.K., as illustrated by the response to the BBC's decision to mark its anniversary with an actor's reenactment interspersed with critical analysis.
"Why the BBC would think to do this at a time when far-right nationalism and casual racism is on the rise in Europe and the UK is baffling," Charlie Brinkhurst-Cuff declared in The Guardian. "It's a flamboyant party trick that masks the deadly undertones of racism in British society that still exist … and it's been heartening to see an immediate backlash to its decision on social media."
Andrew Adonis, a Labour member of the House of Lords, pronounced the speech "the worst incitement to racial violence by a public figure in modern Britain" and declared that "the BBC should not be broadcasting it on Saturday." He then sent a letter calling on government regulators to preemptively forbid the broadcast. "If a contemporary politician made such a speech," he wrote, "they would almost certainly be arrested and charged with serious offenses."
The BBC responded to its critics as follows:
For one side, a society confronted with nativistic sentiments ought to suppress their expression; for the other, they are best aired with context, analysis, and dissents.
In my estimation, both the original "Rivers of Blood" speech and the controversy over how the BBC marked its 50th anniversary clearly bolster rather than undermine the case for airing problematic speech rather than suppressing it.
The original speech utterly failed in its aim.
After delivering it, Powell appropriately faced political consequences: He was dismissed from his position in the shadow cabinet. "I dismissed Mr. Powell because I believed his speech was inflammatory and liable to damage race relations," said the Conservative leader Edward Heath. "I am determined to do everything I can to prevent racial problems developing into civil strife … I don't believe the great majority of the British people share Mr. Powell's way of putting his views."
Many observers believed that its extremeness undermined the cause of limiting immigration as politicians distanced themselves from its entire approach. In any event, British immigration continued, far surpassing even the projections that alarmed Powell. No incentive program was adopted to urge existing immigrants to leave. And no viable anti-immigration coalition emerged.
If British law circa 1968 had forbidden a politician like Powell from delivering a speech of that kind, would suppression merely have spared immigrants the fear they felt on hearing it? Or would it have radicalized the anti-immigrant faction, or turned its members into free-speech martyrs to rally around? It is impossible to say for sure. But if pro-immigrant folks could rerun history from 1968 onward, suppressing the speech would be imprudent, given how things turned out.
As for the BBC's decision to reenact and analyze the speech 50 years later, it seems to me that doing so illuminated at least three crucial points quite powerfully.
First, Enoch Powell wasn't just utterly ineffective in pushing his agenda forward—he was utterly wrong about the consequences of ignoring his warnings. He forecast bloody conflict by 1988. His constituent fretted that Britain's black population would "have the whip hand" over the white population by the same year. Now that twice as much time has passed, even as more than twice as many immigrants as he anticipated took up residence, it could not be clearer that his dire predictions of bloody ethnic strife were flat wrong. That doesn't strictly prove that today's dire warnings are similarly wrongheaded, but it does illustrate how wrong nativist alarmism can be (even when uttered with confidence) and suggests needless damage can be avoided by rejecting it.
Second, the speech was a useful reminder that there are always some people in all democracies who find the rapid changes and the different kinds of diversity that characterize most free societies to be deeply unsettling and hard to abide. Karen Stenner argues persuasively that we ignore the reality of those people and their innate predisposition to prize sameness at our peril. Even as we reject their most coercive demands, we must find ways to understand and ease their discomfort—or face the sorts of authoritarian backlashes that tear societies apart.
Are we doing enough to avoid that fate today?
Third, to hear the speech today is to be forced to confront its racism. Even the mass-immigration skeptic Douglas Murray, author of The Strange Death of Europe: Immigration, Identity, Islam, admits to "an intake of breath and a considerable wince or gulp" at various moments. One of the most flagrant such moments concerns the aging pensioner who was cast as the quintessential victim of demographic change. In fact, she was brought to the brink of bankruptcy by a self-inflicted handicap: As the racial composition of her neighborhood changed from all white to all people of color, she simply refused to rent to the latter, then sought welfare in order to subsidize her xenophobic housing discrimination.
Today's right will see much more clearly than the anti-immigrant right of 1968 that the social worker who told her that "racism doesn't pay" and the kids who called her a "racialist" when she walked down the street were simply correct. As Jemima Lewis wrote in The Telegraph, "The examples he gave of 'decent, ordinary' Britons suffering because of immigration … could hardly have been less sympathetic."
In 2018 Britain, speeches raising fears about immigration are typically much milder. Insofar as that reflects a debate more focused on legitimate fears about integration than retrograde racism or skin-color determinism, that change is an unalloyed good. But insofar as some anti-immigrant figures use coded language to obscure their commitments to bigotries as virulent as ever, hearing "Rivers of Blood" is clarifying. As one Tory MP put it, "His speech was terrible. It's why I think the BBC broadcasting it was a good thing. Because lots of people say 'Enoch was right,' without ever having bothered to read or listen to the speech."
The rebroadcast also inspired some Britons to post on social media under the hashtag Rivers of Love. The sentiment "Enoch was right" might have gained more traction in a Britain too politically correct to air his arguments. Now that the BBC has exposed them to sunlight, that sentiment may wither under evidence of all the ways Enoch was wrong, some of it marshaled by people newly determined to disprove his pessimism.
Earlier this month, Damon Linker warned the American left against the tactic of trying to suppress ideas that they deem beyond the pale rather than mounting the most rhetorically and logically formidable counterargument possible.
He regards the tactic as the manifestation of a dangerous fantasy that politics can be settled:
In 1968, Enoch Powell had his say and was defeated by rhetorically and logically superior ideas. Today, his ideological descendants are in a much weaker position, due partly to demographic change, and partly to rock-solid proof that the most dire predictions made five decades ago did not come to pass.
Alas, xenophobic impulses are present in every generation. And hard, legitimate questions about integration will always confront every society that attempts to rapidly welcome newcomers from radically different countries and cultures to live in dense, diverse communities of free, fallen humans.
But even when such conversations veer from constructive concerns to Powellite hysteria, the safest way forward, for immigrants, their descendants, and society, is to best xenophobic ideas as they were bested before, not to try a new, risky, suppressive approach that is more liable to fuel extremism, create free-speech martyrs, and give authoritarians a precedent to suppress the ideas that they find dangerous. "Rivers of Blood" didn't ultimately teach much that Powell intended. And yet, as with so much of history, studying it can teach us quite a lot.
Posted: 23 Apr 2018 03:00 AM PDT
Legislation to protect Special Counsel Robert Mueller has been hailed as a ray of bipartisan sunshine in a divided Congress. The only problem is that even if it could pass both chambers with a veto-proof majority, there may not be enough votes on the Supreme Court to save it from President Trump's opposition.
The Special Counsel Independence and Integrity Act, sponsored by Republican Senators Thom Tillis and Lindsey Graham and Democratic Senators Chris Coons and Cory Booker, would make federal law of Justice Department regulations stating that the special counsel can only be fired for "good cause." It would also require the Justice Department to preserve evidence from the investigation into Russian interference in the 2016 election, as well as allow Mueller to challenge his dismissal in court. Senate Judiciary Committee Chairman Chuck Grassley has said he will bring the bill up for a vote this week.
The bill's goals sound relatively modest, as it doesn't expressly bar the president from firing Mueller. But there's a robust debate among legal scholars across the political spectrum as to whether the bill nevertheless goes too far. Some argue it would unconstitutionally infringe on the president's authority to fire executive-branch officials, while others say there's precedent for restrictions. The debate goes to the heart of not just the Mueller legislation, but of questions about the limits of presidential power—questions that have become ever more resonant in an era where the president frequently threatens his critics with prosecution.
Paul Rosenzweig, who served as an attorney for Kenneth Starr's investigation of the Clinton administration, is among those who believe there is some precedent for the special-counsel bill: The heads of independent agencies, like the Federal Election Commission, for example, can only be fired for cause. However, even he acknowledged the Mueller bill—which combines several pieces of similar legislation—might not be airtight. "I think the bills are constitutional, but it is certainly arguable," Rosenzweig said.
Constitutional interpretation is not simply a matter of which side has the more persuasive argument or the nobler intentions. Realistically, the bill's constitutionality, and thus its survival, comes down to votes—and not just those in Congress. Akhil Reed Amar, a constitutional scholar at Yale University, is among those who say Congress can't limit the president's firing power, because the Constitution vests all executive authority in the president. Mueller and other federal prosecutors are "inferior officers" under the Constitution, and therefore can be dismissed by a "superior" officer.
Other experts point out that the Constitution does not explicitly give the president power to determine who gets prosecuted. "Presidents can set criminal-justice policy, but—except perhaps in cases with foreign-policy implications—the president does not have constitutional authority to direct a federal prosecutor to initiate or dismiss criminal charges, or to direct how to conduct a particular grand-jury investigation or criminal prosecution," said Bruce Green, a law professor at Fordham University and a legal-ethics expert who recently co-authored a paper on prosecutorial independence. "Prosecutorial independence is a cherished value in our democracy. Prosecutors are supposed to make decisions based on criminal-justice principles, not partisan politics."
Whether or not one agrees with Amar's argument, there may be enough votes on the Supreme Court for it to carry the day. The reason why, according to Amar, is that much of the high court has already endorsed precedents that suggest Trump would prevail in any legal conflict over his authority to fire the special counsel.
Amar points to two cases to support his argument: Morrison v. Olson in 1988 and Myers v. United States in 1925. The Morrison case upheld the 1978 Independent Counsel Act, passed in the wake of Richard Nixon ordering the Saturday Night Massacre, which created a special executive-branch position that could be used to investigate malfeasance by high-ranking federal officials and whose occupant could not be fired by the president without cause. While this outcome would, at first blush, seem to support the Mueller bill, a number of liberal legal scholars have endorsed the late Justice Antonin Scalia's dissent in Morrison, in which he argued the law usurps presidential power. As recently as 2015, sitting Justice Elena Kagan called it "one of the greatest dissents ever written and every year it gets better." The Independent Counsel Act was eventually allowed to sunset after the Iran-Contra and Whitewater investigations, with both parties having felt subject to partisan crusades by unaccountable prosecutors.
Meanwhile, Myers suggests conservative justices would support Trump. The majority opinion in that case—authored by the chief justice and former president William Howard Taft—concluded that the president can fire executive-branch officials without congressional consent. Amar notes that Myers's reasoning was recently cited in the majority opinion in a 2010 case, Free Enterprise Fund v. Public Company Accounting Oversight Board. "The Constitution that makes the President accountable to the people for executing the laws also gives him the power to do so," wrote Chief Justice John Roberts, who was joined by his fellow conservatives. "That power includes, as a general matter, the authority to remove those who assist him in carrying out his duties." Since 2010, the high court has retained its executive-power-friendly conservative majority, and its Democratic appointees may have lost a dissenting vote: John Paul Stevens, one of the dissenters in Free Enterprise Fund, was replaced by Kagan.
Not everyone agrees that those rulings are predictive of the justices' take on the Mueller bill. "I think it is a mistake to assume that views Supreme Court justices express in the abstract necessarily predict what they will do when confronted with the facts of a particular case," said Deborah Pearlstein, a constitutional-law scholar at Princeton University. "A great many lawyers in this country, on the right and the left, read the papers every day and are deeply concerned that the special-counsel process be allowed to continue, in order to uphold the principle that no one, not even the president, is above the law. The justices read the papers, too."
Pearlstein also noted that Morrison hasn't been overturned. "Whatever criticisms might have rightly been levied against the now-lapsed independent-counsel statute that the Morrison case upheld, the proposed bills currently under consideration are crafted carefully to avoid making the same mistakes," she added.
The legislation still has a long way to go before hypothetically reaching the Supreme Court. First, it has to get out of the Senate: Majority Leader Mitch McConnell has vowed not to bring the bill to the floor, despite Grassley's pledge to move it forward in committee. And Trump may yet decide that replaying the Saturday Night Massacre is too great a political risk. As Rosenzweig has written, given the layers of bureaucracy at the Justice Department, firing Mueller or the official overseeing him—Deputy Attorney General Rod Rosenstein—still might fail to end the Trump-related investigations. Mere discussion of the Mueller bill in Congress might also convince the president that the legislature could turn on him if he tries to fire the special counsel.
"Legally, it is not necessary," Rosenzweig said, referring to the Mueller bill. "Its main value is political and demonstrative."
Without the bill in place, however, the only hope for an immediate, robust response to Mueller's dismissal would be Congress aggressively using its oversight power to investigate the White House. With Republicans in control of both chambers, that seems unlikely. The bipartisan appeal of the Mueller bill in the upper chamber may be precisely that it prevents Congress from having to do its job.
Posted: 23 Apr 2018 01:50 AM PDT
Since assuming the presidency, Donald Trump has dragged age-old protectionism out of the past. He has imposed new tariffs, blocked international mergers, and manipulated global trade—particularly U.S. trade with China. The two nations have become so enmeshed in this standoff, with China instituting tariffs and halting U.S. mergers of its own, that it has become common to suggest that the two nations have plunged into a full-scale "trade war."
In such times, it's helpful to remember that a trade war isn't actual war. It is, at most, a rudimentary economic policy directed at a foreign government and its people. Nevertheless, President Trump has consistently explained his newfound protectionism—especially vis-à-vis China—on national-security grounds.
This should be confusing to an honest observer. When trade and commercial policies are routinely couched as national-security measures, the country finds itself on a perpetual pseudo-war footing. When the president justifies all his actions as necessary to keep America safe, the idea of actually acting to protect national security begins to lose its meaning. This is exactly the political conundrum the nation now faces.
Perhaps the most notable instance of President Trump justifying his economic actions in the name of national security was his decision to impose tariffs to help the U.S. steel and aluminum industries and "protect the American worker." Trump, meanwhile, denounced the North American Free Trade Agreement as "a very bad deal" while tying his view to national security. He also promised to build a wall on America's southern border to save the purported "hundreds of billions we spend year after year providing services and benefits to illegal immigrants"—seemingly an economic argument—while insisting that the U.S. military guard the border until the wall is complete—a national-security framing.
Trump has made it hard for the international community to tell whether he's acting out of an earnest concern for national security or whether he's motivated by base protectionism. And, frankly, it's hard for the U.S. Congress, the courts, and the American people to tell, too.
Perhaps more unsettling, albeit less intuitive, is that Trump's penchant for conflating national-security concerns with economic justifications may be even more of a problem when he actually has a plausible national-security basis for his trade policies. Last month, Trump halted the acquisition of Qualcomm, a leading U.S. chipmaker and cellular technology firm, by the Singapore-based Broadcom Limited. The proposed $117-billion hostile takeover would have given Broadcom, a major semiconductor firm, ownership of perhaps the world's premier wireless communications technology firm in Qualcomm. The Trump administration was apparently worried that Broadcom's cozy relationship with Chinese companies closely tied to Beijing would mean that the proposed Qualcomm acquisition would give China inordinate power in the global cellular technology market. Trump halted the deal through presidential proclamation, indicating that Broadcom, if allowed to purchase Qualcomm, "might take action that threatens to impair the national security of the United States."
Here, Trump's invocation of national security was plausible—convincing, even. In its review of the proposed Qualcomm deal, the Committee on Foreign Investment in the United States (CFIUS), an interagency board in the executive branch that conducts national-security assessments of commercial proposals, reportedly focused on the fact that Broadcom engages in joint-business ventures with Huawei, a powerful Chinese telecommunications and technology firm with close ties to the Chinese government. Putting Qualcomm in Broadcom's hands could have given China immense influence in shaping the global telecommunications industry, from semiconductors to cellular chips to call transmissions.
China already plays a sizable role in setting standards for forthcoming wireless technologies like 5G—standards it could soon foist on the rest of the world, allowing it to promote industrial policies that favor its own state-sponsored products over those developed on the open market, as the Treasury Department emphasized in a letter to the lawyers trying to finalize the deal. An acquisition of a U.S. tech powerhouse by a foreign company could sap America's long-term economic strength. It would also shift power to governments that don't share its commitments to free expression and stifle competition by consolidating control of the critical semiconductor industry among an even smaller number of firms.
All told, President Trump, with CFIUS's support, had what looked like a solid case for halting the Qualcomm deal on national-security grounds. But confusion about why he halted the deal ensued nonetheless. Congressional Democrats quickly admitted that they "don't know the links between Broadcom and China" but nonetheless felt "unequivocally [that] President Trump and his administration made the right decision on blocking Broadcom from taking over Qualcomm." (This was an uncertainty perhaps aggravated by the oblique language in Treasury's public letter.) The Democrats thus praised Trump for halting the deal based on an undefined mix of national-security and economic concerns. Congress, like the rest of us, does not get to see the entirety of CFIUS's work. But Congress, like the rest of us, has become numb to Trump's insistence that his economic policy is, at every turn, driven by his concern for national security. So Congressional Democrats ultimately didn't know quite what to make of this invocation of national security to drive trade policy, when that justification seemed plausible but overused.
There are numerous additional dangers to this inclination to invoke national security. Other countries will increasingly follow Washington's lead by pointing to national security to justify a whole range of actions. China has already invoked supposed national-security concerns to demand intrusive terms from American companies seeking to operate in China—terms that those companies would never accept in the United States, like ensuring governmental access to individuals' secure communications.
This tendency is also bad for U.S. credibility on trade policy generally. When, in the wake of the Qualcomm block, Washington proposed tariffs on a wide range of Chinese goods, including on the seemingly justifiable grounds of seeking to avoid the theft of trade secrets, Beijing was able to denounce Washington's justification as, yet again, pretext.
Trump's confusing stance on issues at the intersection of national security and commercial policy also makes it harder for U.S. companies to stand up for themselves. In May 2017, China passed a data-security regulation requiring that foreign firms intending to collect Chinese citizens' personal data establish data farms in China, partner with Chinese firms to maintain the localized data, and cooperate with Chinese authorities should they wish to review even sensitive, personal data. It's hard to imagine a more intrusive regulatory standard.
Yet Apple, seemingly worried about losing access to the Chinese market and its manufacturing base there, swiftly complied with Beijing's demands. Given the Chinese regulation's infringement on individual rights, much of the global community hoped that companies like Apple would resist. But, with Trump's simultaneous invocations of national security as a pretext for protectionism, it's hard for companies like Apple to contest similar excuses from foreign governments.
Going forward, the United States can regain its credibility both at home and abroad by restoring national-security justifications and trade reasoning to their respective spheres. This will require steering away from protectionism, especially when it's pursued under the guise of national security, and back toward open investment and trade. Others have begun issuing a similar call on purely economic grounds—powerful ones on their own terms.
But it's critical to issue that call on national-security grounds, too. There will be real threats to U.S. national security that demand invoking national-security concerns to halt transactions like the Qualcomm acquisition. The country needs the president's credibility intact for those critical moments.
Posted: 22 Apr 2018 07:17 PM PDT
Every week for the second season of Westworld, three Atlantic staffers will discuss new episodes of HBO's cerebral sci-fi drama.
David Sims: Probably my favorite line in Jurassic Park is, unsurprisingly, delivered by Jeff Goldblum (playing the sardonic mathematician Ian Malcolm). As John Hammond (Richard Attenborough), the kindly inventor of the malfunctioning dino-park, defends himself by pointing out that Disneyland opened to a raft of technical faults. "Yeah, but John, if the Pirates of the Caribbean breaks down, the pirates don't eat the tourists," Ian shoots back. As Westworld's second season begins, the pirates (well, cowboys) are finally eating the tourists, and the first episode, "Journey Into Night," takes place mid-meal.
Westworld's John Hammond–type, Robert Ford (Anthony Hopkins), is dead, shot in the head by one of his creations; the park's Ian Malcolm stand-in, Bernard (Jeffrey Wright), has just realized he's a robot himself. Every part of the park and its subterranean control rooms is littered with bodies, the aftermath of an ongoing violent uprising of its robotic "hosts" against their creators and the tourists. "Journey Into Night" (written by the show's co-creators Lisa Joy and Jonathan Nolan, along with Roberto Patino) is a chaotic table-setter that seemed mostly interested in addressing some of the imbalances of Season 1. Dolores (Evan Rachel Wood), the robotic damsel in distress, is now carrying out summary executions. Maeve (Thandie Newton), the madam-turned-corporate infiltrator, is taking out employees with a machine gun. Are we supposed to be rooting for them?
I think that's left ambiguous, for now, but there were moments of nastiness that ultimately just felt too glib to me. I rolled my eyes at Dolores hanging the well-dressed tourists from Westworld's board and spitting some of the show's iconic lines at the camera ("Doesn't look like anything to me," she snarked as they pleaded for mercy). I was slightly more appreciative of Maeve making the sniveling Lee (Simon Quarterman) strip in front of her, but in the end the visual gag is the same—the roles have been reversed. The hosts are confronting humanity with their own inhumanity.
It's a bit of drama that won't stay interesting for long, especially if characters like Dolores, the square-jawed Teddy (James Marsden), and Maeve are the protagonists we're rooting for. But I'm not sure that they are. If Westworld is a show about evolving consciousness, then our hero is probably Bernard, a man with one foot in each reality, a host who's played his part in controlling the other hosts and is now fighting to stay on the right side of survival. As Maeve and Dolores cut a bloody swathe through the park, Bernard linked up with Charlotte (Tessa Thompson), the executive director of Westworld's board, in her quest to extract the departed Ford's secrets of robot consciousness.
Of the main plotlines in "Journey Into Night," this one grabbed me the most. That's partly because Wright is such a magnetic actor (even when he's playing a brain-damaged robot), and partly because Charlotte's aims are as opaque as the white, gluey golem she has serving her in her secret lab. There's real malevolence to her, not the passive sort of cruelty that ran roughshod over Season 1, and her creepy automaton manservant was one of the few genuine jolts I got from this episode.
For the most part, though, this felt like a regular old entry of Westworld, as much as the order of things has been totally upended. The other main plotline of "Journey Into Night" saw William (Ed Harris), the black-hatted human outlaw, quest through the park until he found Ford's creepy little robot in search of an info-dump. What did he get instead? More questions, a tease of a new game for him to play, and a whole lot of circular language. It's as open-ended as that tiger that's washed up onto the beach. Spencer, as I welcome you to this weekly discussion, I have to ask: Do these violent delights really have any particular end in mind?
Spencer Kornhaber: The answer to your question is right there in the secret code you're quoting. These violent delights really do have violent ends. Which are also, this being the season premiere, violent beginnings. Not just violent, either: sadistic.
Dolores and her newly hostile hosts staged less a revolution than an Old Testament reckoning, or an ISIS assault. As the humans blubbered and begged for their lives, the bots turned them into target practice, used their corpses to set up ambushes to create more corpses, and hanged them only after the slow torment of a monologue. Maeve took a lighter touch, but she was still in the wrathful mode when she made Lee Sizemore strip for her schadenfreude.
This sort of mayhem is exactly what humans come to Westworld for, but is it what we viewers come to Westworld for? And if we flinch more now that it's humans in the crosshairs, is that hypocritical? The show, provocatively, wants to force such questions. Bloodshed on HBO is no new thing, but Game of Thrones never asked us to cheer with Ramsay Bolton. When Bernard told Dolores that what's real is what's "irreplaceable," it was a summary of what makes people different from hosts: They die forever. But is that enough for a separate ethical standard?
Dolores's grand speech at the gallows was, I agree, a bit much. But that overwroughtness is a sly joke in itself. After all, Dolores-cum-Wyatt was programmed by the likes of Robert Ford and Lee Sizemore, who write dialogue with the subtlety of Michael Bay and fashion aphorisms with the airs of Rupi Kaur. I loved when Maeve made a hacky violent threat to Sizemore and Sizemore pointed out that he'd written the line. "A bit broad," Maeve smirked.
That meta moment raises a deeper mystery. Is Dolores acting upon her own free will as she leads her man-made buddies in a quest to destroy mankind, both inside the park and out? Or is this genocide mission programmed? Ford may have sneakily led her to achieve the freedom of consciousness over the course of Season 1, but he also seemed to choreograph his demise at Dolores's hand to coincide with the climax of the big toast he gave. Is she following his orders still? Or is her venom genuine?
Plausibly it could be genuine, given that Dolores "remembers everything." Everything includes a lot of horrible abuse over many lifetimes—it's not hard to imagine she'd come to the conclusion the species who abused her must be exterminated. But last season, we learned that Maeve's seemingly urgent quest to escape Westworld was actually scripted. When maternal instinct—her new primary drive, it's affirmed in this episode—overpowered that script, it raised the notion of a yet-deeper level of control either rooted in her own newly conscious mind or some super-crafty creator.
Now, as Bernard plays double agent among the Homo sapiens, we're similarly left to wonder why he's doing what he's doing—even if what he's doing for now is just looking around confusedly as he accompanies Delos top brass in two different timelines. Eventually, somehow, he'll drown all the hosts in a newly created sea. Before then, we can guess, he'll help Charlotte chase after the MacGuffin of Peter Abernathy, who contains some invaluable trove of info (user data, Cambridge Analytica–style?). Or maybe Bernard's actually Ford's insurance policy against Delos's larger plot coming to fruition, and he'll end up following deep-rooted directives to kneecap Charlotte's efforts.
What's certain is that Bernard and Charlotte's scene together in the secret bunker filled with faceless hosts was the best part of the episode—full of the discomfiting sci-fi surrealism we come to Westworld for. As for the mysteries of the plot, it feels as though Jonathan Nolan and Lisa Joy winked at us viewers during the descent into that bunker. "I can tell you what this isn't," Charlotte said. "This isn't me reading you in." But maybe you picked up on some other clues, Sophie?
Sophie Gilbert: One thing I wanted to note, which I mentioned in my review of Season 2 also, is that the opening credits have changed. The image from last season of two humanoid bodies posed like lovers has been replaced by the picture of a "woman" cradling a "child," and the final Vitruvian man has been exchanged for a Vitruvian woman. What does this mean? I thought immediately of Maeve's maternal instinct, compelling her to find a daughter she knows isn't really her own. But also of the process of creation more generally. For many thousands of years there was only one way to make life, until Delos found a new one. Arnold and Ford, the Prometheuses of Westworld's founding mythology, sculpted life out of clay (or industrial-grade silicone, or whatever upgrades the future contains), and then gave their creations the tools that would advance their kind. But we know this story. There are always consequences for such overreach. "Folly of my kind," Ford (via Young Ford) told the Man in Black, shortly before his head was half blown off. "There's always a yearning for more."
David, I was also struck by the Jurassic Park-ness of where Westworld is now, particularly because, as Bernard found out, the hosts have an inbuilt subconscious link to the hosts closest to them. If they can connect with each other, can they also communicate? If so, that can't be good news for any of the remaining humans stuck inside the park. I confess, I'm a bit weary of the Man in Black/William's storyline, only because it seems like such an odd distraction from the other events unfolding. Ford programmed a robot revolution but he also made time to set up an extra special game for Westworld's most loyal and depraved visitor/investor? Why?
Spencer, your question about how autonomous Dolores actually is is a fascinating one. If we take her at her word (hard to do given the meta-ness of Lee exposing how canned the hosts' language remains, even now), she's neither Dolores nor Wyatt anymore. "Those are all just roles you forced me to play," she said. "Under all these lives I've lived, something else has been growing. I've evolved into something new … I have one last role to play. Myself." It was a powerful scene, and Dolores's use of the word "reckoning" added some extra contemporary resonance. If we interpret Westworld in part as an allegory for the process of making entertainment, which the show is winkingly obvious about, Dolores represents all the ways women and female characters have been oppressed in the past—shoehorned into limited roles, idealized and objectified, horrifically abused to enable other people's heroic journeys. That Dolores's murderous rampage played out to the chirpy sounds of "The Entertainer" felt almost too cute.
Evan Rachel Wood was spectacular in the scene, really conveying how fiercely Dolores wanted to make people pay. And yet there's something about Dolores's newfound autonomy that's less convincing, less human somehow than Maeve's. Maeve has broken out of her prewritten storylines to do things that she's not supposed to do, most notably following her desire to reunite with her daughter. "She's just a story, something we programmed," Sizemore told her. "She's not real." Maeve's response had some Shylock to it. "Not real?" she said. "What about me? My dreams. My thoughts. My body. Are they not real?" If you prick her, does she not bleed? If you implant reveries in her head that enable memories and self-awareness, will she not break out of preprogrammed story loops in order to do something she actually wants to do for once?
Dolores, by contrast, feels in some ways like she's still following orders. But she's also bent on revenge, and on dominating not just the park but the spaces outside it. "It won't be enough to take this world," she told Teddy. "We'll need to take that one from them as well." Given that at this point we still have no idea what the world outside the park looks like, it's hard to know how realistic her plan is. Still, it's interesting that the two hosts breaking free seem to be embodying the best and worst of humanity: the capacity for love, and the uglier desire for power.
Posted: 22 Apr 2018 02:11 PM PDT
BEIRUT—In a neighborhood in east Beirut, you'll come across a nondescript parking lot, backed up on one side by an Ottoman-era house and on another by a sleek high-rise, casting its shadow across a mix of old shops and upscale design stores below. It was from inside one of these old shops in the late 1970s—a few years into Lebanon's long, violent civil war—that Avedis Manoukian, a shop owner, saw the trucks loaded with dead bodies roll up to what is now the parking lot; back then, it was an empty, dirt-covered patch. "He kept telling us, 'Today, they came with bulldozers and dumped some more corpses there, covered them with earth. Then they did it again, then they did it again,'" Aline Manoukian, Avedis's daughter, recalled when we spoke. As a young girl, she would visit her father's store and look out onto the lot, filled with an unknown number of unidentified bodies. "The story remained with me," she said. "Every time I pass by, I know there are people there."
This site is one of well over 100 locations scattered throughout Lebanon believed by researchers to be mass graves, a grim legacy of a 15-year war that pitted Lebanese-Muslim, Lebanese-Christian, Palestinian, and other sectarian militias against one another, leaving between 100,000 and 200,000 people dead and thousands more missing. Since the war's end in 1990, activists and NGOs have pressured the Lebanese government to mount a serious effort to locate people who went missing, to little avail. The 15-year Syrian occupation that followed the conflict, a brief war with Israel, an influx of refugees from Syria, and protracted economic and political turbulence have helped push the issue to the bottom of the government's agenda. There's also little political will to investigate cases of those who went missing during the war: Today's politicians were yesterday's militiamen, and few of them seem interested in digging up the past.
With each passing year, preserving evidence that could clear up the fate of the missing—and bring peace to their long-suffering families—becomes harder. Witnesses who have information about detention centers and burial sites, and former fighters, whose memories hold answers the families seek, are growing old. Meanwhile, the burial sites themselves are steadily being destroyed. In Beirut's intermittent spurts of postwar prosperity, it has sprouted countless towering office buildings and luxury apartments. "Do you know how many mass graves you're stepping on when you walk through Beirut?" Malena Eichenberg, a researcher at the Lebanese NGO Act for the Disappeared, asked. "Parking lots are mass graves here."
The government has done little to investigate what happened to those who went missing. In 2000, a government commission released a skimpy three-page report that acknowledged three well-known mass-grave sites. When Syria's 15-year occupation ended in 2005, a Syrian–Lebanese commission was created to investigate the fate of Lebanese people who disappeared into Syria, but it only fully investigated two cases. So several years ago, a handful of NGOs decided to take up the work they believed the government should be doing: They began documenting every detail that a future government commission would need to determine the fate of missing people.
Since 2015, Eichenberg and her team have worked to locate burial sites throughout Lebanon and plot them on a password-protected digital map. Assembled using information from open sources and hundreds of interviews with families of the missing, witnesses, and former civil-war combatants, the map bursts with colorful pins, each representing a painful memory: a mass grave, a checkpoint, a detention center, an armed confrontation, the last known location of a missing person. As Act for the Disappeared locates burial sites, its researchers rate them on three dimensions: the credibility of the information that led them to the site, the political sensitivity of the mass grave—if found to be connected to a group that's still in power in Lebanon, it receives high marks for sensitivity—and the risk that the site will be destroyed. Burial sites deemed at high risk of destruction are often located in areas under development. Often, when researchers went to a suspected burial site, they found a luxury condo sitting on top of it, its wealthy inhabitants likely unaware of what lay beneath.
So far, Act for the Disappeared's secret database has information on nearly 2,200 missing people and 112 mass-grave sites. It's unclear how many people went missing during the war: The government puts the figure at 17,000, but activists believe that number double or triple counts many of the missing. Eichenberg said a more realistic estimate is around 8,000. Of the burial sites logged in the database, dozens are documented in detail, their locations confirmed by multiple sources, including witnesses and news stories.
With this information, Eichenberg and her colleagues are reconstructing the well-worn paths along which thousands were led from kidnapping to burial. The key nodes on these routes—checkpoints, detention centers, burial sites—are linked together in the database. This can help reveal the fate of other missing people, Eichenberg told me, as she showed me around the complex dataset she and her colleagues have compiled. "If everybody that we know so far was taken from this checkpoint to this detention center, we can infer that [others] are following the same line, and we can expect to find them in the third place," she said, referring to a potential burial site.
Until now, Act for the Disappeared has not publicized the map's existence. "We fear that there will be intentional destruction of graves if we release information without proper protection [for the sites]," Eichenberg said. Justine di Mayo, the director of Act for the Disappeared, stressed that this research isn't ready to be shared with the families of the missing. It's intended for protecting burial sites and, eventually, for passing along to authorities, she said.
This concern over the burial sites stems from Lebanon's remarkably abrupt transition from civil war to peacetime. A postwar amnesty law pardoned crimes that took place during the conflict, absolving the leaders of violent militias of responsibility for their actions. But despite that legal forgiveness, activists worry that if the map of mass-grave sites became public, former fighters could try to destroy the very evidence the map is intended to protect.
Even today, gathering sensitive information about the civil war is difficult. Many adults, whether they fought in or lived through the conflict, prefer not to talk about it. Children rarely learn about it in school. "Everyone in Lebanon knows someone who went missing. But they don't speak about these things," Manoukian said. Describing the mindset of those who avoid speaking about the conflict, she added: "Why open these wounds and say, 'Look what they did to us, and what we did to them?' … Everybody has this feeling of guilt. Because we slaughtered each other." But Manoukian herself, like the activists I spoke with, said that staying silent is more dangerous than dealing with the painful past. "Lebanon will see a new cycle of violence if we don't address what happened," di Mayo said.
The data-gathering push has also stirred controversy. Groups like the Committee of the Families of the Kidnapped and Disappeared in Lebanon, which was founded in 1982, preferred to continue pressuring the government to form an investigatory commission rather than do the work for the government. On April 13, the 43rd anniversary of the beginning of the civil war, the Committee launched a new campaign to lobby candidates in Lebanon's upcoming parliamentary elections—the country's first in nearly a decade—to support creating a commission. (Various bills that would establish a commission have been stalled in parliament since 2012.)
In the meantime, researchers from Act for the Disappeared, the International Committee of the Red Cross, and other groups continue gathering information. But documentation won't prevent the destruction of burial sites. The next step, di Mayo said, is to open legal cases to prevent construction on the most at-risk sites. She said there are at least five sites in urgent need of protection, and that she hopes that information gathered by Act for the Disappeared can help families of missing persons—or the Committee of the Families—begin legal proceedings to protect these sites within the next three months. It wouldn't be the first time this tactic has been used. Three cases filed by the Committee have been stalled in court for years, and it's unclear whether it is still pursuing them; the Committee declined to comment.
Fueled by frustration, advocates for the disappeared hope to dial up pressure on the government by going public with their research. Eventually, after handing over their meticulously assembled work to the authorities, they hope to become irrelevant. "Like any government who respects itself, [Lebanon's] should be able to give some answers to these people," Manoukian said. "They owe them this. They owe their people this."
Posted: 22 Apr 2018 10:51 AM PDT
It's astonishing how quickly the story of the North Korea crisis seems to have changed from one of fear to one of optimism. Less than a year ago, the U.S. president was threatening "fire and fury" against Kim Jong Un, the leader of North Korea; now he is touting upcoming talks with him. It is not the case, as Trump tweeted Sunday morning, that the North "agreed to denuclearization." But the North's recent declarations that it would at least talk about denuclearization, and put a moratorium on the nuclear and missile tests that kept the world on edge last year, certainly do look like significant concessions.
Or do they? Does this all really mean that the North Korean nuclear issue, the scourge of U.S. policy makers for decades, is about to be solved? Not at all, even if recent developments provide some grounds for—very conditional—optimism.
Only a year ago, the Kim regime had reasons to be happy. In 2017, it tested two long-range missile prototypes capable of hitting the continental United States with a nuclear warhead, and also exploded its first hydrogen bomb. But Kim soon found himself contending with a rather unconventional U.S. president in Donald Trump, who promised to use armed force if the North Koreans did not agree to abandon their nuclear program. This was new. For decades, North Korea has been certain that the United States would never strike first: Seoul, the capital of South Korea, the closest U.S. ally in the region, lies within range of North Korean heavy artillery. If the North retaliated, hundreds of guns would transform downtown Seoul into an inferno. Such a crisis would be followed by a war of immense destruction.
But Trump has altered this calculus. While his "fire and fury" threat may have been a bluff, he has persuaded the Kim regime that it is dealing with a president who is willing to risk Seoul (along with the U.S.-South Korea alliance). A U.S. military strike, Kim has come to see, is no longer an impossibility. Trump's threats also persuaded China to support the toughest sanctions regime ever imposed on North Korea, making it near-impossible for it to sell anything on the international market. So far, North Korea's increasingly market-driven economy is doing surprisingly well. But that won't last forever—something the Kim regime knows. North Korea, as a result, chose to retreat. In November 2017, it halted its nuclear and missile tests, and on Saturday, reiterated this position in dramatic terms.
While all this sounds great, it doesn't alter one simple truth: North Korea will never fully surrender its nuclear weapons. From Kim's point of view, nuclear weapons constitute his only guarantee of survival. North Korea saw what happened to Saddam Hussein, whose attempts to develop nuclear weapons were cut short by an Israeli air-force raid in 1981. It saw how things went in 1994, when Ukraine surrendered its Soviet-era nuclear heritage in exchange for "guarantees" from the United States, Britain, and Russia, to respect its territorial integrity. Above all, North Korea remembers the sorry fate of Muammar al-Qaddafi of Libya, the only dictator in history who agreed to surrender his half-baked nuclear program in exchange for economic benefits. This is why the Kim regime has spent 60-odd years building up its nuclear program.
Nonetheless, North Korea now needs to neutralize what looks like the looming threat of a U.S. military strike, while also relieving the pressure of sanctions. Thus, the North Koreans have to make concessions. They will indeed stop testing, and they might agree to surrender some of their equipment and weapons, and profess a theoretical commitment to eventual denuclearization. (Such lip service won't mean much: According to the 1968 Non-Proliferation Treaty, the United States, as well as virtually every other nuclear power, is also formally committed to eventual denuclearization.) However, the North Koreans will insist that denuclearization should happen gradually and in stages—they have already said so, and found a hint of Chinese support. Then, they will ensure that these stages will be numerous and prolonged, thus winning time in hopes that sooner or later the White House will have a more conventional inhabitant.
But it would be a big mistake for Trump and his advisers, some of whom will see through the scheme, to refuse to compromise and demand an immediate and full capitulation. Surrender is not going to happen: If confronted with the choice between denuclearization and dealing with famine and bombing, North Korea's leaders will choose the latter. An imperfect compromise is better than a large-scale war—not another Afghanistan or Iraq, but another Vietnam.
One hopes, then, that President Trump accepts a compromise deal—perhaps one in which he offers to reduce the U.S. naval presence in the region, or some economic relief. He should squeeze as much as possible from the North Koreans—judging by their behavior in recent months, they are ready to give up a lot—and then go home.
Posted: 22 Apr 2018 02:08 PM PDT
On Tuesday, President Donald Trump and First Lady Melania Trump will host the first official state dinner of this administration at the White House, honoring visiting French President Emmanuel Macron. As Mrs. Trump's team and White House staff work on the final details for the formal event, we present a look back at some state dinners held by past U.S. presidents, from Eisenhower to Obama.
Posted: 22 Apr 2018 07:05 AM PDT
The first thing you might notice about Season 2 of Westworld is that the opening credits have changed. During the first season of the HBO drama about an adult theme park staffed by humanoid "hosts," the introductory title sequence featured a variety of images showing robots being sculpted into life by machines: sinews being painstakingly stretched over bone, skeletal hands playing a piano, a bone-white "couple" who appeared to be making love. In the very first shot a "sun" appears to rise over a mass of muscle and tissue, hinting that the frontier of Westworld isn't the Old West but the new technology within the hosts. In the second season, though, the image of the lovers has been replaced by a mother cradling an infant. And the homage to Leonardo da Vinci's Vitruvian Man that closes out the credits now features a body that's distinctly female.
What does this mean? Well, Westworld is a puzzle, and so self-aware about that fact that the rollout for Season 2 featured an elaborate tease for the show's Redditor-detective fan base involving rickrolling and 21 minutes of a dog sitting at a piano. You could interpret the new elements in the credits as a nod to any number of threads Season 2 begins to unravel. Maeve (Thandie Newton), the host and former brothel-keeper, is on a mission to find the daughter who was written into one of her many "storylines." Dolores (Evan Rachel Wood) is newly sentient, and leading a violent uprising against the architects of her 30-year abuse at the hands of Westworld's more sadistic visitors. You could also read the image of a mother and child as a metaphor for the questions of creation that Westworld unpacks—the Promethean arrogance and folly of giving life to something without considering the consequences.
And that's the main drag of Season 2, although hardcore fans might argue it's an asset: There's really a lot going on. Season 1 introduced Westworld, a futuristic, exorbitantly expensive theme park set in the Old West whose humanoid hosts catered to its guests' every whim, even—or especially—when those desires included rape, torture, and murder. Hosts had their memories reset and their wounds patched up by Westworld's fleet of human technicians, allowing them to relive their ordeals anew the following day (like Prometheus, whose liver grew back each night only to be picked out again daily by an eagle). The pattern was disrupted when Westworld's co-creator, Robert Ford (Anthony Hopkins), updated the hosts' programming to enable their consciousness of what they were and what was happening to them. In the finale, encouraged by Ford (whom Dolores shot dead in an expression of her autonomy), the hosts took over Westworld in a gory coup.
Without spoiling any of the new developments, the second season continues and expands the narratives in play. There's the reality of the uprising playing out and the hosts running the show, which brings to mind the lesson from history that bloody revolutions rarely end well. (A frequently repeated refrain in the show, which seems to incite rebellion among the hosts, is a Romeo and Juliet quote, "These violent delights have violent ends.") There's the overarching question of what it means to be human—to have the capacity for self-awareness and love, but also for cruelty and barbarism. There's the contemporary resonance that comes when Dolores (whom Wood has cited as enabling her own real-life recovery from profound trauma) describes the host insurgence as a "reckoning."
There's also the fact that Westworld has always winked that it's a story about making stories. Westworld the park employs writers to create and script the loops various hosts play out, which you could (charitably) deduce is why the dialogue sometimes seems so flat. A welcome note of comic relief in Season 2 comes from Lee Sizemore (Simon Quarterman), Westworld's narrative director, who finds himself trapped in scenarios that he devised. ("You try writing 300 stories in three weeks," he grumbles at one point when his originality is questioned.)
The meta-ness doesn't stop there. Season 1 of Westworld critiqued the way purveyors of entertainment stoke baser human desires for graphic sex and violence … while serving up graphic sex and violence within the framework of a popular television show. Season 2 does the same with cultural stereotypes, via the Ghost Nation tribe that terrorizes Westworld, and with a new theme park owned by the same company. Is Shogun World (one of six parks operated by Delos) making a broader point about the samurai and geisha clichés embedded in Western ideas of Japanese culture? Or is it simply trafficking in the same imagery?
In short, there's a lot to think about. Deciphering the show's puzzles can be thrilling, as fans who'd long suspected the Man in Black (Ed Harris) was an older version of innocent William (Jimmi Simpson) discovered. But Westworld's complexity can also feel like an orchestra in which every musician is playing a different composition simultaneously. You long for some harmony, or at least the opportunity to focus on one melody at a time. Jonathan Nolan and Lisa Joy, who created the series, seem to have masterful control over where their narrative is going, which makes it easier to trust that the multiple arcs will eventually align. Whether it will all add up to something is a less settled question.
Still, Season 2 of Westworld is always absorbing, and more dynamic in its pacing than Season 1. It's also graced with some tremendous performers, notably Wood, Newton, and Jeffrey Wright doing double duty as Bernard (a robot in the present moment) and Arnold (Westworld's co-founder, seen in flashbacks). Bernard, who only recently discovered that he wasn't human, is the closest thing viewers get to a guide through Westworld's thorny terrain. He's more than a host, more than a mortal. And, for the first half of the new season, at least, he seems completely perplexed as to what's going on. "How did all these disparate threads come together to create this nightmare?" a character asks Bernard in one episode. Here's hoping everyone gets to find out.
Posted: 22 Apr 2018 06:54 AM PDT
A week ago I quoted an unnamed "reader in New Haven," who offered thoughts about "The Future of Elite Schools in the Trump Era." That occasioned a lot of response, which is still coming in. I quoted some of it in "Trump vs. Harvard and Yale" and "The Future of Elite Schools, Continued."
This next installment comes from the author of the original message, who is now willing to be identified. He is Michael Doolittle. As he explains, he is a Harvard College alumnus, and he works as a photographer in New Haven. In the message below he talks about the under-publicized but important role of sports in elite-college admissions. As he says in an introductory note:
Now, Doolittle's response to those who have read and reacted to his original message. By the way, the photos in this post are by him, of scenes at Yale:
JF note: By chance I know Michael Doolittle's parents and once worked closely with his father. Back to his message:
Posted: 22 Apr 2018 05:00 AM PDT
The 2019 Subaru Ascent will have 19 of them. Not airbags, but cupholders. That's more than in any mass-market vehicle ever produced, amounting to almost two-and-a-half cupholders for each passenger. There's room for a Starbucks skinny latte, an unnaturally colored Big Gulp, a Yeti Rambler, and juice boxes galore. So many cupholders, in fact, that The Wall Street Journal recently declared: "We are approaching peak cupholder."
Although it might be hard to imagine now, eating and drinking in cars was once next to impossible. Rough roads, along with the lack of power steering and advanced suspension systems, made it difficult and unpleasant to manage anything beyond a quick swig from a flask while on the road.
Cupholders began as an afterthought, mere circular indents on the inside of the glove-compartment door, but they have become an absolute necessity and a key feature that shoppers evaluate when purchasing a new car; for a time, they even supplanted fuel efficiency as consumers' most sought-after attribute.
Cars like the Model T had accessories for eating, but they were intended for use when the car was parked. In 1936, for example, E. B. White waxed poetic in The New Yorker about the number of pages in the Sears catalogue once devoted to pimping out a Model T. Among the various items available for sale in those pages were small kitchenettes packed into trunks and attached to the sideboards of cars. They came complete with iceboxes and storage bins for pantry items like flour and sugar and a small foldout table. They were most likely used in auto camps, where early motorists stopped to enjoy the scenery, relax, and refuel.
On the luxury end of the spectrum, cars like the Rolls Royce came equipped with elaborate, monogrammed picnic baskets complete with silver utensils. F. Scott Fitzgerald's description of Gatsby's Rolls was characteristic: "It was a rich cream color, bright with nickel, swollen here and there in its monstrous length with triumphant hatboxes and supper-boxes and toolboxes." But these lavish items were most definitely meant for a stylish roadside picnic, not for eating while motoring.
It didn't become fashionable to eat in cars until the 1950s, when drive-in restaurants became popular. Waitresses on roller skates, called carhops, delivered milkshakes and burgers on trays whose crooked arms hooked over half-rolled-down car windows. Meals were enjoyed inside the car and the driver handed drinks to passengers who either nestled them between their legs or set them perilously on the floor.
Soon, however, more adventuresome travelers took advantage of better roads and smoother transmissions, adapting their automobiles so that it was possible to have a snack in a moving car. The November 1950 issue of Popular Mechanics shows a photo of a small snack tray that hangs from two cords attached to the dash with suction cups. It mentions that snacks can be enjoyed while moving and the tray has room for two small bottles of soda. The tray was designed to be stored in the glove compartment when not in use.
Three years later, in 1953, a patent for a car cupholder was granted to an inventor in Texas. The drawings look remarkably like the cupholders of today. They portray either a single cylinder to hold a cup or a small console wedged between two seats in the back with two round holes for drinks. Still, it would take years for auto manufacturers to warm to these designs; the cupholder would continue to be an afterthought for manufacturers for two more decades.
Until then, production cupholders were prototypical at best—more like suggestions for a spot to put a drink than places to secure one. They were likely used for holding drinks purchased at newly popular drive-through restaurants like McDonald's. And it seems clear from the precarious nature of their design that the car would have to be stopped while eating and drinking for these early cupholders to serve a purpose.
Things began to change during the late 1960s and 1970s. The suburbs grew, and the idea of sleeping in one community and working in another gave birth to the modern commute. The car, which had originally liberated rural folks from a social life confined to front porches and front parlors, became more than just a conveyance: It transformed into a place in its own right.
Manufacturers might have been slow to warm to the idea of cupholders in cars, but car owners were not. They first began to haul drinks via after-market add-ons. "The first widely available true cupholders," writes the Duke University engineering and history professor Henry Petroski, "were holster-like plastic ones" that attached to the window.
Petroski says that these cupholders became popular just as the pop-top can replaced soda bottles in the mid-1960s. He describes them as having a "thin, flat, hooked extension that was inserted between the window and the inside door panel, squeezed in between the glass and the then commonly used, feltlike material that kept car windows from rattling."
Seeing the popularity of the plastic cupholders, manufacturers adopted them as part of a new overall interior design starting in the mid-1980s. Chrysler reportedly put the first cupholders in mass-market vehicles in its popular 1984 Plymouth Voyager minivan. They were small depressions in the center consoles of the vans, intended to support a 12-ounce cup of coffee.
Minivans would eventually come to symbolize modern motherhood and become emblematic of a harried woman trying to do it all—suited up for work and still making it to soccer practice at the end of the day, feeding the brood in the back with sodas and snacks, all the while stoking herself with caffeine as she rushed through her day. And that's likely the real origin of the cupholder as we know it—when the minivan became a living room, dining room, and study hall all in one, and the cupholder became a necessity more than a convenience.
In 2007, PricewaterhouseCoopers reported that, according to its surveys, the number of cupholders in a car was a more important factor for consumers purchasing a car than fuel efficiency. With so many vehicle models to choose from, drivers seem willing to allow the placement and quantity of cupholders to drive their purchasing decisions, at least in part. "Cupholders complete an interior," one gearhead writes, "sometimes taking the whole car along for the ride."
Meanwhile, the ride is getting longer and longer. Today, the average American spends about 50 minutes commuting each day. Eating in cars is so common that some models come equipped with built-in vacuums to clean up the crumbs. In-car food and drink has become such an important category that fast-food companies now test their products for spillability and leakage—what they call "one-handed convenience."
A recent, successful Kickstarter campaign ups the ante by featuring the Saucemoto—an in-car dip clip for ketchup and dipping sauces—essentially a small ramekin holder designed especially to accompany chicken nuggets and french fries. More than 3,000 backers pledged over $60,000 for a chance at an early model of this design.
Long commutes and active, harried schedules contribute most to the rise of the cupholder. But one academic links the desire to travel with warm beverages back to humans' earliest needs for warmth and succor. G. Clotaire Rapaille, a French-born cultural anthropologist, claims that sipping a warm liquid while speeding down the highway is an act akin to reaching for a mother's breast. It is, Rapaille says, a necessary component for our view of the car as safe. "What was the key element of safety when you were a child?" he asks. "It was that your mother fed you, and there was warm liquid. That's why cupholders are absolutely crucial."
This must have been news to European auto manufacturers. Although thoroughly familiar with the appeal of an espresso, continental designers long resisted placing cupholders into their vehicles, relenting only when U.S. sales started to suffer. "For years, Mercedes was convinced we should teach Americans to drink their coffee at home," Daimler AG's CEO, Dieter Zetsche, told The Wall Street Journal. "Obviously, that didn't work out so well." To compete, European and Japanese carmakers studied the size and shape of U.S. beverages, even going as far as shipping empty containers back to headquarters, or 3D-printing models of Big Gulp cups.
It's a complicated design problem. Various drink sizes must be accommodated, from thin Red Bull cans to giant fountain-drink cups to square juice boxes. The cupholders themselves must be located where there are already multiple competing demands for space—from heating controls to GPS screens to places to hold and charge phones. The design challenge is so great, and the puzzle-like dimensions of the interior so difficult, that some designers claim that the cupholders are among the first things to be situated when it comes to new interior designs.
Even a visionary like Elon Musk can fail to grasp the importance of the cupholder. Despite the enormous advances in technology and sustainability that Tesla offers drivers, the company faced intense criticism because its original design for the Model S had no rear cupholders.
In the case of Tesla, an after-market industry arose, with companies offering LED-lighted cupholders that could be inserted into what some customers thought was a deficient rear console. Tesla responded rapidly and its cupholders in the Model X are now artful, curved, and customizable for any drink size.
In the decades since they were formally ensconced in the minivan, cupholders have become ubiquitous. In 1999, for example, ABC's Nightline asked the Palo Alto design firm IDEO to redesign the supermarket shopping cart—and that design somewhat famously included a cupholder for busy shoppers.
In addition to grocery carts, cupholders are now common on riding mowers, baby strollers, and the large institutional floor scrubbers used by nighttime cleaning crews in hospitals and airports. Everything, it seems, must offer a place to put a beverage.
If driverless cars become a reality, they might afford very different interiors. Some have imagined future autonomous vehicles as little offices or conference rooms. One design for a driverless car even has bench seats and a lounge-like atmosphere. Petroski imagines that in the future a cupholder might "move under a cup being put down by a driver watching the road the way an outfielder moves under a fly ball." Or, he says, "truly visionary drivers might even fantasize of the robot cupholder that can move a cup into a hand groping in the dark." Ford envisions a more active future for the automobile, self-driving or not. Last year, the company filed a patent for a gyroscopic cupholder that aims to keep a drink upright even while the vehicle charges up steep terrain.
But some auto manufacturers preparing for the convergence of two important trends—ride sharing and driverless cars—have a decidedly less rosy view. In 2017, Investor's Business Daily reported that manufacturers are reconsidering automotive design for the novel conditions of autonomy, and for services like Uber and Lyft. Among other things, their interiors will be smell-resistant and puke-proof. That's a far cry from Gatsby's romantic Rolls, or even the adventurous camper in a Model T, let alone the car interior as succor akin to a mother's breast.
The car "moves on, but not on our lines," wrote the English novelist E. M. Forster, who was astonished by the automobile's power to shape human behaviors and therefore society. "[It] proceeds," he wrote. "But not to our goals."
After all, it was never a conscious choice to prefer a hot coffee and burrito slurped down at 60 miles per hour to a coffee sipped out of a china cup with the day's newspaper at hand and a hot, cooked meal on the kitchen table. The cupholder is a tool that drivers and passengers adore and demand, without ever considering why, or what alternatives they forgo in obsessing over drinks on the go.
Posted: 22 Apr 2018 09:12 AM PDT
Every once in a while I'll re-watch an old episode of Friends, because it's familiar and soothing and there. The other day, Netflix served up one of those flashbacks the show would sometimes air to poke light fun at the friends and at the visual absurdities involved with being alive in the '80s: Rachel in chintz, Ross and Chandler in tragicomic Flock of Seagulls bouffants, etc. Watching the meta-nostalgia, I was reminded of the existence of a minor character who nonetheless plays a major role in the show's universe: Fat Monica.
Fat Monica is technically just a younger—and slightly larger—version of Standard-Issue Monica; what becomes wincingly clear, though, as the Friends flashbacks play out, is that Fat Monica differs from the other Monica not just in scale, but in kind. Padded by her former girth, Monica Geller—the person who categorizes her hand towels and designates committees for the planning of birthday parties and is, in general, in thorough control of her life and her Type-A-tastic self—undergoes a transformation: Her voice gets higher. Her movements become jerking and awkward. She giggles a lot, uncomfortably. Remember when, in those late-series episodes of Family Matters, Steve Urkel would go into that flashing box and emerge as the suave Stefan Urquelle? Fat Monica's metamorphosis is a little like that, but in reverse: The transformation depletes her dignity rather than compounding it. She becomes bashful. Childish. Foolish. Watching the proceedings, you start to wonder whether Monica Geller, for the purposes of the flashback scenes, was given a fat suit or a lobotomy.
The Fat Monica thing is an easy joke—which is to say, it is a lazy joke—but it doubles, as lazy jokes so often do, as an insight. When Friends, looking for reliable lols, put the skinny-even-by-Hollywood-standards Courteney Cox into cheek-jowls and body-lumps—and, then, when it proceeded to suggest that the physical change would alter Monica's very personality—the show neatly channeled the way American culture itself treats fatness, by default: as a flaw not just of appearance, but of character. As an aesthetic failing that doubles as a moral one.
Friends may have arrived on the scene in the years before "body positivity" would pervade magazines and blogs and Instagram, before Dove would attempt to reclaim the pear shape by turning it into bottles of body wash, before "empowerment" would be reduced to a chipper marketing slogan. The show anticipated the current moment, though, in its inability to imagine that a fat Monica Geller could be, fundamentally, the same person as a thin Monica Geller. Every cameo Fat Monica makes on the show—every lurching dance she does, while eating a slice of pizza, as the studio audience guffaws at the sight—is a starkly literal realization of one of the few insults that has, over the decades, retained its capacity to degrade: She's really let herself go.
In 1990, four years before Friends premiered on NBC, Naomi Wolf published The Beauty Myth, her examination—and her indictment—of the way attractiveness functions as both a metaphor for and a mandate over women's lives. The book now has a sequel, of sorts: Heather Widdows, a professor of philosophy at the University of Birmingham in England, will soon be publishing Perfect Me: Beauty as an Ethical Ideal. The book, a scholarly work that is urgently relevant to the current cultural moment, is definitely not about Fat Monica; in another way, though, it is deeply about Fat Monica. It is an expansive inquiry into the treatment of one's appearance as a ratification of one's character. "As a value framework, the beauty ideal provides shared standards by which to apportion praise, blame, and reward," Widdows writes, "making beauty-success a moral virtue and beauty-failure a moral vice."
In American life, the beauty ideal both Wolf and Widdows are taking to task has adopted a this-is-water kind of status: Its demands—made not exclusively of women, but made most directly of women—are so thoroughly infused into our commercial culture that to talk about them at all can seem, if not hopelessly naive, then thoroughly redundant. And when we do talk about them, the words we're left with tend to reflect an acquiescence to beauty's power: Attractiveness—and it is revealing, of course, that this consummately subjective word has come to suggest a kind of objective truth—is often discussed using the sweeping language of moral virtue. The word "beautiful" shares a root with bene, the Latin for good, and the ancient etymology is summoned every time thigh gaps are treated as evidence of self-control, every time clear skin is assumed to be a manifestation of a calm mind, every time L'Oreal chides women to choose its brand of wrinkle elixir—to help erase the visible evidence of smiles and sun and life itself—"because you're worth it."
Here is the logic of the prosperity gospel, essentially, applied not merely to the quality of one's possessions, but also to the quality of one's appearance. The Americans of 2018 have at their disposal, arguably, more ways than ever before to control their personal levels of attractiveness, from makeup to Spanx to exercise regimens to hair dyes to nail polish to retinols to the services plastic surgeons carefully euphemize as "procedures." Those things can have positive effects (makeup can be a means of self-expression; skincare can be a communal exploration). But they have also raised the stakes. Not only do they reaffirm the notion that beauty can be bought—that it is a matter of class privilege—but they also, steadily, transform the meaning of beauty itself: from a matter of luck, an accident of atomic arrangement, to the product of dedicated labor. Beauty, in that frame, becomes a commentary on one's work ethic. And, indeed—Fat Monica may have been a joke, but she understood the order of things—on one's character.
Which means that it's perhaps easier than ever, if also as unjust as ever, to blame the person who fails to live up to the narrow standard—particularly, as Fat Monica was also acutely aware, when it comes to weight. It's definitely easier to do that kind of blaming than it is to question the standard that demands the conformity in the first place. The prosperity gospel is ruthlessly efficient in its judgments. Maybe if she'd just work a little harder. I'm not being superficial; I'm just thinking about her health. Beauty is truth, truth beauty; that is all ye know on earth, and all ye need to know.
While beauty as an ethic is omnipresent—in ads, in music, in the TV shows of the mid-to-late 1990s—its logic has re-entered the conversation more directly in recent days because of the premiere of I Feel Pretty, the Amy Schumer vehicle that doubles as Hollywood's latest work of faux feminism. Directed by Abby Kohn and Marc Silverstein, who previously wrote the screenplays for, among other rom-coms, How to Be Single and He's Just Not That Into You, I Feel Pretty has been sold as a "'body-positive' film." In that marketing it has had assistance from a strident hashtag—#FeelPretty, it commands—and from Cosmopolitan magazine, which has been using the film as an opportunity to talk about the merits of that quintessentially modern aspiration: self-confidence.
The pitches are accurate to the extent that the upshot of I Feel Pretty, if you don't mind the spoiler, is: Have, if you possibly can, self-confidence. The film concerns Renee, a charming but sad young woman whose fondest wish is to know what it feels like to be beautiful (which, in the film's mind, seems only to mean "skinny"). Through a turn of events that evokes the transformative magic of Big and 13 Going on 30 and Freaky Friday and What Women Want—Pygmalion, basically, but without all the effort—Renee gets her wish. Sort of. At a SoulCycle class, she falls off her bike and hits her head. When she awakens from her enchanted slumber in the SoulCycle locker room, she is beautiful.
Or, well—here is the "com" element of this particular rom-com—she thinks she is beautiful.
I Feel Pretty is in some ways a slightly more self-aware inversion of Shallow Hal, the 2001 Jack Black/Gwyneth Paltrow comedy, which similarly attempted to give cinematic credence to the truism that It's What's Inside That Counts. Paltrow, in the Farrelly Brothers' film, is Rosemary, a very good person who occupies what the film sees as a very bad body; Black is Hal, a hopelessly superficial guy who—because of a spell cast on him early in the movie by one, yes, Tony Robbins—comes to see people's inner goodness manifested in their appearances.
Watch, then—and laugh, allegedly—as Hal meets Rosemary and proceeds to operate under the comically mistaken impression that his new girlfriend is approximately as gorgeous as Gwyneth Paltrow. Behold the jokes about the pair canoeing together as Hal, on the stern, is hoisted into the air by the heft of his bow-based boatmate; and as the two get into a car together and her side promptly crunches down; and as Hal returns to the enormous diner milkshake the two are meant to share, only to find the glass already empty, sucked down by his date with an almost mechanical efficiency. Watch the hilarity that ensues when the Farrellys serve up the extremely predictable jokes about Rosemary, pure of heart but sullied of form, sitting on chairs and benches and immediately breaking them.
There's a revealing doubleness to Shallow Hal—and one that has very little to do with the True Beauty stuff the Farrellys tried so hard to telegraph in their rom-com. Shallow Hal is thoroughly beset by the unflinching and omnipresent aesthetic demands Heather Widdows is describing in her book: It is the ethical ideal of beauty made manifest. The film claims to be challenging superficial and constraining standards of (women's) appearances; in the end, of course, it endorses those very standards. It talks about real beauty as the stuff of kindness and goodness and love, but it can't bring itself, in the end, to believe its own easy message. I know that because Shallow Hal, in spite of it all, is a feature-length series of fat jokes.
Nearly two decades later, I Feel Pretty has served up a similarly revealing mixture of aspiration and acquiescence. It wants, so much, to be better than it is. It truly believes it is better than it is. The film, having shifted its gaze from the male to the self, says one thing—Be confident in yourselves, ladies! Empowerment! SoulCycle!—but it cannot summon the courage to believe its own platitudes. The butt of the movie's running joke (and I do mean that literally, since I Feel Pretty has great fun splitting its heroine's yoga pants down the back and otherwise exposing her flesh to a mocking world) is Renee's tragicomic misunderstanding of her body as an object. "Yes, modeling is an option for me," she says during a job interview, breezily, while the film helpfully offers a beat so its audience can laugh at the absurdity of her delusion.
The audience, similarly, is meant to cringe in pre-emptive horror when Renee applies for a receptionist job usually reserved for aspiring models. And to laugh knowingly when, while scarfing her lunch (carbs!), she informs her model-thin coworkers, "I can eat whatever I want and still look like this." And to wince through the guffaws when Renee enters a bikini contest—a turn of events that the film portrays in cheeky slo-mo, as if the sight of its heroine's belly, disguised by neither clothing nor shame, is a joke all its own.
Some of this is mitigated by the fact that I Feel Pretty really seems to believe that Renee, as a person, is just as fantastic as she comes to believe she is; she is, after all, funny and quirky and hard-working and smart (and, by the way, totally crushes that receptionist job). It's "not about an ugly troll becoming beautiful," Schumer told Vulture, defending the movie; "it's about a woman who has low self-esteem finding some … . Everyone's got a right to feel that feeling, regardless of their appearance."
But film is a visual medium, and I Feel Pretty can't find a way, in the end—nor does it seem, for all its ambitions, to have looked terribly hard—to offer a critique of the impossible beauty standards that doesn't also capitulate to the impossible beauty standards. It acknowledges, sure, that pretty people have their own self-image problems; it fashions itself, definitely, as a rom-com that is—wait for the twist!—also about Renee's romance with herself; it concludes with an uplifting message about the merits of self-confidence. But that ultimate embrace of self-esteem, I Feel Pretty's answer to a chase-the-love-interest-down-at-the-airport finale, comes in the middle of Renee's pitch for a new line of … makeup. Nathan Poe did not, to my knowledge, extend his theory of parody to social commentary, but the logic applies here nonetheless: I Feel Pretty tries to make a joke at the expense of superficial notions of beauty. It gives those ideas, though, the last laugh.
So here, again, as so often happens in works of commercialized feminism, is the person questioned while the system she's caught in remains intact and assumed and inevitable. Here is beauty, still, treated by easy default as the axis around which so many lives must spin. Here is that convenient catch-all, "self-esteem," portrayed as both the corrective to the beauty myth and the evidence of its continued power. I Feel Pretty, in all that, comes to feel distinctly petty: Love yourself, despite your flaws! the movie cheers. And then it whispers: Remember that they're really massive flaws.
There's an air of soft defeatism that permeates these proceedings—one that reads as a referendum not just on a well-intentioned-but-deeply-flawed film, but also on the stories we tell ourselves in order to live. The film's confusion, after all, is America's confusion. Renee is Fat Monica, chomping on a donut while dancing. Renee is Shallow Hal's Rosemary, tipping the canoe. Renee is a walking, talking, bikini-contest-entering reminder that self-confidence, as long as the world around it insists on equating physical beauty with moral achievement, will be laughably insufficient.
Today's culture is one that treats "wellness" itself as a matter of economic privilege (and one in which the star of Shallow Hal would like to sell you some $36 coconut oil from a brand named Skinny & Co). In that landscape, ideals of beauty, which have so long been targeted by gender and race and class—which have so long been weaponized—become even more insidious. When I recently spoke with Heather Widdows, the author of Perfect Me, she put it this way: "The fact that we think it's normal to be dissatisfied with one's body in some way—I mean, that says an awful lot. How did we get to the point where that's not regarded as odd?"
It's an extremely good question. Part of the answer, as Naomi Wolf suggested, is that the capitalistic enterprises that shape American culture have a deeply vested interest in keeping the public insecure—always looking for self-improvement, always looking to fix what isn't broken, always looking for a bit of modern magic. But part of the answer, too, is in the very things that are meant, in theory, to transcend the vagaries of quotidian concerns: our art. Our leisure. American popular culture still insists, product by product, on the truth of the beauty myth.
The myth is there, masquerading as fact, on Jessica Jones, when the show's protagonist comments, disgust in her voice, on a woman who has stopped exercising to eat a doughnut. And on Master of None, when Dev and Rachel engage in a joking discussion about the most polite thing to call a fat person. And on 30 Rock, when Jenna's weight gain finds her employing the demeaning catchphrase "Me Want Fooooood." And on Glee, when Quinn's competition for prom queen reveals that the cheerleader had once been an overweight 8th-grader nicknamed "Lucy Caboosey." And on American Housewife, which dedicated its pilot episode to fat jokes made by the titular character—about herself. And on How I Met Your Mother, when the show's writers, for a brief arc, put Barney—vain, clothes-obsessed Barney—in a fat suit. And on New Girl, which imagines Fat Schmidt, the show's answer to Friends's flashback figure, as someone who is a little bit silly. And a little bit sad.
Friends ended its run 14 years ago; Fat Monica, however, remains. Not just as an inspiration for other sitcomic characters, and not just as an occasional appearance on a Netflix screen or a basic-cable station near you, but also as a specter. As a joke. As a warning. "I called you fat?" Chandler says, when he is reminded that, in college, he made an off-handed remark about Ross's "fat sister"—and when he learns that Monica had overheard him making the comment. With that, Chandler Bing, the human embodiment of the mordancy of the '90s—could he be more sarcastic?—proceeds to express the most sincere regret he will ever demonstrate over 10 seasons of Friends. "I'm so, so sorry," he tells the no-longer-Fat Monica. And he means it. He is thoroughly chastened. He called her fat, after all; and he can't imagine—nor can his TV show imagine, on his behalf—a more terrible insult.