Dimitrios Halikias' amateur ruminations on philosophy, politics, and history. "How small of all that human hearts endure, that part which laws or kings can cause or cure" - Samuel Johnson. Contact me at dimitrios.halikias@gmail.com
Sunday, August 28, 2016
T.S. Eliot on "Social Justice"
T.S. Eliot in a footnote in "Notes Towards the Definition of Culture":
I must introduce a parenthetical protest against the abuse of the current term 'social justice.' From meaning 'justice in relations between groups or classes,' it may slip into meaning a particular assumption as to what these relations should be; and a course of action might be supported because it represented the aim of 'social justice,' which from the point of view of 'justice' was not just. The term 'social justice' is in danger of losing its rational content--which would be replaced by a powerful emotional charge. I believe that I have used the term myself: it should never be employed unless the user is prepared to define clearly what social justice means to him, and why he thinks it just.
Friday, August 26, 2016
Our Culture of Obsessive Nostalgia: A Guest Post by Rich Lizardo
Rich Lizardo is a Ph.D. student in history at the University of Pennsylvania, where he studies early-modern Spain. An article he wrote for National Review Online was cited in an earlier Omphaloskepsis blog post. Contact him at rflizardo@gmail.com.
With a Clinton running for president, Donald Trump aiming to “Make America Great Again,” and PC culture making a noxious comeback, countless commentators have appropriately noted the “politics of nostalgia.” Especially among whites, data of many sorts suggest a longing for the past—a past where “life for people like [them]” was better, where their family ties and civic commitments and religious communities were stronger, where their future economic prospects seemed brighter. My friend Dimitri Halikias is writing a series of reviews on a recent book by political scientist Yuval Levin discussing this very topic.
But despite a rise in the collective awareness of this “politics of nostalgia,” discussion of the much broader culture of nostalgia has been, with a few notable exceptions, quite absent. For at least the last five years, an obsession with nostalgia has driven much of our present cultural formation (well, re-formation) and engagement across almost every medium of artistic expression.
The most egregious culprit producing this hypernostalgia is our film industry. At my local movie theater, there are nine films currently on screen: among them, Alvin and the Chipmunks: The Road Chip, the fourth film installment since 2007 about a group dating back to the ’50s; Ben-Hur, a remake of the classic film from 1959; Jason Bourne, the fifth film of the franchise based on the 1980 novel; Mechanic: Resurrection, a sequel to a film in 2011 that was itself a remake of a film from 1972; Pete’s Dragon, Disney’s remake of its own film of the same name in 1977; Suicide Squad, which features a host of DC Comics characters we’ve seen for over half a century; and Star Trek Beyond, which, well, you get the point. (If I went to my theater and wanted to watch a film with an original script, my sole options would be some horror flick and Sausage Party.) Also currently playing in other theaters: sequels of The Purge and Ice Age and remakes of The BFG, Ghostbusters, and Tarzan.
And that’s just what’s in theaters right now in August 2016. As New York Times columnist Ross Douthat put it back in January, “The biggest blockbuster of 2015 was about . . . Darth Vader’s grandchildren. It is directed by a filmmaker who’s coming off rebooting . . . Star Trek. And the wider cinematic landscape is defined by . . . the recycling of comic-book properties developed between the 1940s and the 1970s.” Coming soon: more Jedis, more Avengers, and probably at least one more James Bond.
Even on TV—a medium that, to be fair, has continued to produce original, mass-market material in this “Golden Age”—there’s apparently a market for Fuller House, which earlier this spring was renewed for a second season. Jimmy Fallon hosted the Saved by the Bell cast to act in a skit on his show last year, using a recreated set of Bayside High; and the cast of Friends held a reunion at the start of the year.
To prove our hypernostalgia in television isn’t limited to just the ’90s, the show 24, after eight relatively successful (and solid) seasons from 2001 to 2010, rebooted and released a ninth season a full four years later in 2014; a spinoff, titled 24: Legacy, is scheduled to premiere in 2017. The producers of the best show in the last decade, Breaking Bad, just couldn’t help themselves when the show finished: Less than two years later, they put out the spinoff, Better Call Saul. And The Simpsons, while not a reboot, a remake, a spinoff, or a sequel, is still on air—beginning its 28th season(!) in a few short weeks.
This unfortunate phenomenon also affects the books we read, the musicals we watch, the games we play, and even the clothes we wear. According to Publishers Weekly, the number-one bestselling work of fiction in 2015 was Go Set a Watchman, Harper Lee’s follow-up to To Kill a Mockingbird, which itself made the list at number seven. Also in the top 20 bestsellers of 2015: The Great Gatsby, Fahrenheit 451, and The Alchemist. As of this moment, Harry Potter and the Cursed Child tops Barnes & Noble’s list of the top 100 bestsellers of 2016. Moreover, in the last year, the musical that has swept the nation—more so than any musical in my lifetime—features the story of a famous Founding Father we’ve presumably all learned about in grade school. Hamilton obviously takes a shot at building upon our history and delivers it with a wonderful twist, but it’s nevertheless an old story with an old character. And suddenly in 2016, it became okay for folks in their 20s, 30s, and 40s to walk around with their faces dug into their phones as they swiped up to catch Pokémon. Yes, Charizard has been the object of my fascination and affection—but that was when I was 10, playing the Red and Gold versions on my Game Boy Color and trading cards (against the school rules) with my friends—other 10-year-olds.
Facebook, Twitter, and the rest are no help. They do more than just promote all these media; they actively cultivate this disposition of hypernostalgia in each of us individually. They’ve developed hashtags—most notably, #tbt (short for Throwback Thursday) and #fbf (short for Flashback Friday), both of which are often not even used on their designated days—to encourage us to share something from our pasts with others and then feel rewarded through likes, retweets, and so forth. And in the last year or so, Facebook has taken it upon itself to show you, the individual user, a “memory” from your own past that got lots of likes so that you can share it again and get even more likes.
Even in clothing, what’s vintage and what’s retro is often what’s in: dresses, suits, glasses, shoes, the works. Pick up the latest style magazine, and you’ll inevitably see something making a comeback. With no effort, I found a post titled “Trucker Hats Are Coming Back,” written just last week on GQ’s website; in it, one finds the following passage:
With looser-cut ’90s-throwback pants making a return, ’80s cars on the rise, and even square-toed shoes inching their way into the current zeitgeist, could it really be that long before the early aughts [2000s] are due for a re-up? Are Ed Hardy graphics and tracksuits next? Wait . . . it’s already happening.
Reboots, remakes, spinoffs, sequels, continuations, comebacks, flashbacks, throwbacks: None of this is new. To give just one example, Twelve Angry Men was originally written as a play in 1954, was made into a truly fantastic film a few years later, and was remade 40 years later in 1997 in (and with) color.* As one writer argues, our society tends to produce nostalgically in 20-year cycles: reliving the ’60s in the ’80s, the ’70s in the ’90s, the ’80s in the 2000s, and now the ’90s in the 2010s.
Nor is any of it necessarily bad. These things give us a connection to the past, a relationship that might be weaker without them. Certainly, adapting old stories to our modern tastes requires its own kind of creativity; indeed, Hamilton, in all its artistic brilliance, stands out. Certainly, even more-recent remakes can be valuable when the stories are introduced to different audiences; House of Cards and The Office, both American remakes of British originals, are perfect examples. Certainly, reading great works of literature is always a good thing. And certainly, on a more individual level, reliving happy moments from our own pasts can even keep us grounded at times, and sharing those moments with others can be a valuable (often necessary) way to reconnect.
But what is new in our cultural moment is the level of obsessive nostalgia we’ve reached, and what is bad are the costs of its extremity. In the article linked to above, Douthat quotes historian Jacques Barzun to argue that this phenomenon is best explained by a “decadence” defined as a “falling off” whereby the “forms of art as of life seem exhausted” and so “repetition and frustration are the intolerable result.”
That description seems quite apt. We’re not just reliving the ’90s; we’ve mostly exhausted all decades available to us: sporting styles from the ’30s (tab and club collars are in; haven’t you heard?), conjuring up characters from the entire century, posting photos from last month with “#fbf.” On a cultural scale, we’re losing the desire to create new stories, we’re losing the ability to let go of the characters within them when those stories reach natural endings, and we’re more than ever forcing these stories to live beyond their years. Hollywood doles out ever-more superhero flicks, J. K. Rowling expands her magical world further still, and the Pokémon franchise now promotes an old game through a new app—all for one reason: It all sells. What’s more, what these specific examples all have in common is the development of their specific “universes”: the Marvel Universe, the Wizarding World, the Pokémon Universe. Pixar and many others have given in to this impulse. Even when the stories or characters themselves are original, by placing them in a given “universe,” the creators give a nod to our hypernostalgia, thereby retaining us, their loyal consumers.
So long as our audiences keep demanding these recycled stories, characters, backdrops, and memories, the producers in Hollywood, Broadway, Silicon Valley, and elsewhere will keep spitting them out. There’s nothing wrong with looking fondly upon the past, but keeping our focus there crowds out our own originality, creativity, and ability to move on.
We’ve definitely overstayed our visas in our many trips down memory lane, and it’s time we bring ourselves back to the present state of reality.
_____
*Music, it seems, is the one cultural medium that stands as an exception to this new phenomenon of obsessive nostalgia. For many decades, we’ve had tribute bands, Elvis and MJ impersonators, greatest-hits albums, and covers upon covers upon covers. I admit that I can’t quite put my finger on it, but I strongly sense that, despite these unoriginal elements of music stretching across decades, the lack of originality (for want of a better phrase) has much less to do with any sense of nostalgia than it does in the other media I’ve discussed. But check out this book review in the Atlantic of Simon Reynolds’ Retromania: Pop Culture’s Addiction to Its Own Past, which is almost exclusively about music and which explains why I’m wrong.
Thursday, August 18, 2016
"Fractured Republic" Reflections Part One: Corporatization and the New Mass Economy
This is part one of a three-part review of Yuval Levin's The Fractured Republic. See also the introduction and parts two and three of this review.
Yet surprisingly, a brief discussion of unionization notwithstanding, Levin spends little time considering this model alongside broader developments in the American economy. One gets the impression from his chapter on the economy that the age of the large, centralized firm is over, and that economic life today is centered around small, diffuse private companies.
Yet Gallup research shows that small business start-ups have been in steep decline over the past forty years, while the share of private sector workers employed in small businesses relative to large businesses continues to fall. And though I don’t have empirical evidence on hand that this is the case, I would conjecture that a great many mom-and-pop stores and restaurants continue to be replaced by ever-expanding industrial chains.
Consider also the rise of national social media networks and technological platforms, which Levin praises as building “inherently narrow and personalized networks” that “build up subcultures rather than a mass culture.” Is that true? Today, a few powerful institutions—Facebook, Google, Twitter, Amazon, etc.—command extraordinary centralizing power. These organizations, largely run by recent post-pubescents, have control over more of our personal information than any totalitarian secret police ever had over their citizenry. That does not strike me as a platform for decentralization. Moreover, it seems wrong to think that these tech forums promote subcultural differentiation. To the contrary, in giving terrifying force to an amorphous “public opinion,” these institutions may well cement and strengthen the insidious, homogenizing tendencies of mass (progressive) culture.
But let me set my tech-ludditry aside and return to more traditional economic institutions. Tocqueville (following Adam Smith) saw in the modern capitalist division of labor the same dialectical pattern that drives democratic polities from individualistic premises to despotic conclusions. In particular, the logic of economic individualism destroys the rich mediating bonds and institutions that formerly humanized social relations. Industrialization further divides society into two distinct classes, united only through an abstract collection of faceless market forces.
Through the division of labor, Tocqueville explains, as the “workman is perfected” in productivity, so too is “the man degraded.” Economic specialization constructs a new class of industrial elites who, shorn of any aristocratic duty to their workers, grow far harsher than the most brutal of feudal masters. Just as with subjects under soft despotism, the workman grows “dependent on masters in general, but not on any master in particular.” Connected only through contractual agreement to exchange labor for wages, the master-worker dynamic is rendered entirely economic: “the former makes no promise to protect, the latter no promise to defend, and neither habit nor duty creates a permanent bond between them.”
Particularly damning, Tocqueville insists, is the farcical equality now said to define the master-servant relationship. In the marketplace, each party is bound only in virtue of its consent, and in the formal eyes of the law, the two are equals. Yet this “imaginary equality” hardens the hatred between master and servant. For in the “privacy of his soul, the master still deems himself superior,” but no longer an aristocratic guardian, he abandons those traditional “protective and benevolent sentiments that grow out of a long period of uncontested power.” The servant too, though as a matter of contract the equal of his master, recognizes himself as the social inferior, and comes to hate his obligations that stem from a “degrading utilitarian reality.”
Ever the hedgehog, Tocqueville provides a discussion of market dynamics tightly consistent with an animating pattern of thought that resonates throughout his work: hyper-individualism and democratic equality lead to the erection of new consolidated powers, embodied both in a distant bureaucratized state and a distant bureaucratized manufacturing aristocracy. Soft despotism and economic consolidation enervate the people, rendering them dependent on a cadre of amorphous, distant elites. As Tocqueville caustically concludes in a passage as prescient as any in the Norman’s oeuvre, “today’s manufacturing aristocracy, having impoverished and brutalized the men it uses, abandons them in times of crisis and turns them over to public assistance to be fed.”
Has Tocqueville’s prediction come to pass in contemporary American economic life? I’m not entirely sure. But I suspect these descriptions of farcical equality in the face of entrenched inequality and mass dependence on a class of removed corporate executives capture to a large extent the anxiety of Walmart employees and the like. Economic life remains a realm where the dissolution and fracture Levin describes are not clearly apparent. As the logic of corporatization infects an ever broader sphere of social life, economic consolidation, not fragmentation, may well remain the central challenge to be overcome.
Levin, of course, is not silent about the risks of economic corporatization. Indeed, his analysis of economic individualism begins from the same point as Tocqueville’s—it is the Smithian division of labor that both perfects the worker and (possibly) degrades the man. Yet here again Levin offers the same prescription that will be critiqued at greater length in my next post: “we must use specialization to fight the negative effects of specialization.”
Levin warns in principle of “gigantism in the economy” which can itself consolidate “power over workers and consumers” and eliminate society’s indispensable mediating institutions. Yet while acknowledging the possibility of the problem, Levin does not take seriously its applicability to American economic life today. I wonder why. Haven’t chain stores that regulate employees’ lives with rigorously regimented scripts passed far into the realm of consolidated economic gigantism? Shouldn’t conservatives be open to policy prescriptions that substantially restrain the growth of these efficiency-maximizing corporations? What of employee ownership or stock options, reforms that could create an analogue to mediating institutions within the firm? These seem like the sorts of exciting possibilities conservatives should explore.
Levin’s diagnosis of corporate America’s transformation of workers into consumers is also deeply resonant with the Tocquevillian worries described above. He insists that conservatives need economic policies that will “address Americans as workers” once again, rejecting the enervating passivity of consumerism for the energetic activity of dignified work. The worry that Americans have lost their sense of themselves as workers speaks to a dizzying dislocation in economic life. Men who primarily see themselves as workers embody that traditional American ethic of free labor. Consumers, on the other hand, are passive billiard balls, rebounding about an economic system of prices and incentives.
Given that he grasps the Tocquevillian diagnosis, it is disappointing that Levin doesn’t explore new possibilities of what work could look like. Instead of considering the aforementioned potential of greater worker participation and ownership, as traditionally demanded by a distributist vision of conservative economic reform, Levin returns to expanded choice as the solution.
He explains that the way to transform consumers back into workers is by developing “portable, individualized benefits and rights that are not attached to workplaces in ways that assume long-term employment relationships.” But this seems to me exactly backwards. The problem is not that workers are too bound to employers; it is that workplace relationships have grown into cold, staid bonds of mere legal contract, rather than warm, humane bonds of intimate authority. The prescription is not to further promote the individual’s atomization but to revitalize communal life within the workplace.
To conclude, this post has raised some objections to Levin’s discussion (or rather lack of discussion) about the consolidated nature of contemporary American economic life. I have argued that the chief economic obstacle is not an excess of diffusion but a glut of corporate centralization. And I have suggested that Levin’s philosophical solution does not adequately address the distinctive social difficulties posed by bureaucratized economic gigantism. Instead of emphasizing choice, Levin would do well to consider new imaginative possibilities to promote greater individual identification with and investment in the workplace.
I pick up this line of reasoning—my critique of Levin’s democratic solution to a democratic problem—in my next blog post.
Reviewing Yuval Levin's "The Fractured Republic," an Introduction
This is the introduction to a three-part review of Yuval Levin's The Fractured Republic. See also parts one, two, and three of that review.
I have finally gotten around to reading Yuval Levin’s much-heralded study of our present political dislocation and philosophical exposition of a conservative political vision. The book is worth the hype. Combining philosophical insight, empirical breadth, and historical sobriety, Levin well deserves his reputation as the leading intellectual light of the contemporary Right.
The work can fairly be described as an extended meditation on a basic Tocquevillian insight: individualistic atomism and collectivist consolidation are not antagonistic poles, but are rather twin forces marching always in tandem. As I’ve written about before, philosophical individualism leads to the leveling of intermediary authority and the elevation of a distant bureaucratized state as the sole legitimate epistemic and political authority. Levin’s great contribution is to demonstrate that this dialectical theory is no unfalsifiable pseudo-scientific theory of social change, but is instead the basic underlying logic with which we must understand the homogeneity of mid-20th century America and the cultural chaos of the last four decades:
The transient balance of midcentury was undone not by the nefarious workings of ill-intentioned partisans of one stripe or another, but by the progress of the very forces that—acting on a highly consolidated nation—had brought that balance about to begin with: the forces of individualism, decentralization, deconsolidation, fracture, and diffusion.
Again following Tocqueville, Levin suggests that America’s foundational cultural attachment to individualism cannot be abandoned. Nor can the providential fact of the democratic revolution be undone. Any attempt to restore the nation to a mythologized mid-century moment of unity and consensus as is demanded by our collective political nostalgia would be both politically infeasible and morally undesirable.
Instead, America’s political malignancy demands democratic solutions for characteristically democratic problems. Only by strengthening the salutary aspects of an individualistic culture can we temper the insidious effects of the same. To that end, Levin calls for a modernized ethic of subsidiarity to overcome the alienation of contemporary social life. By strengthening abandoned mediating institutions of civil society and by cultivating subcultures of shared meaning and value, the best of our democratic tradition can check the worst impulses of our democratic culture. Though the forces of individualistic deconsolidation have been “the chief sources of many of our deepest problems in modern America,” they must also be “the sources of solutions and reforms.”
In the following three blog posts I raise objections. Some of these objections reflect genuine disagreement, while others give voice to unresolved questions I have with Levin’s basic thesis. But though I take these objections to be significant, I want to be clear that this book is among the clearest, smartest, most eloquent works of contemporary conservative thinking I have ever read. Once again, the hype is well-deserved.
My first post considers whether Levin’s sociological narrative of American political fracture makes sense of the great economic consolidation we have seen in recent decades. I suggest that Levin is insufficiently bold in challenging the kind of corporatistic gigantism and mass culture that Tocqueville aptly diagnosed as the economic partner of political soft despotism. My second post will ask whether a democratic solution can really resolve our distinctively democratic problems, and whether choice can satisfactorily address our contemporary crises of social diffusion and bifurcation. And my third post will outline longstanding concerns I have with the Benedict Option, which, building off Rod Dreher’s work, serves as the philosophic center of much of Levin’s thought.
Friday, August 12, 2016
On the Uses and Abuses of Analogy
A bit of wandering and link-following the other day (originating from Bill Galston’s phenomenal Liberal Pluralism) led me to Richard Whately’s 1861 Elements of Rhetoric, a work full of insights largely lost on the modern mind. One particular example is an excerpt in the appendix on the uses and abuses of analogy from Edward Copleston’s 1821 An Enquiry into the Doctrines of Necessity and Predestination: in Four Discourses.
Analogy—the drawing out of some similar feature between otherwise different phenomena—is not just the central tool of academic philosophy, it's the way rational beings think about complex matters. It is for this reason that Funes the Memorious' inability to draw abstract connections between distinct memories renders the man incapable of rational thought.
That's why the commonplace response of moral outrage (“Did you really just compare X with Y?”) constitutes such a grave obstacle to clear thinking.
Analogy does not mean the similarity of two things, but the similarity or sameness of two relations. There must be more than two things to give rise to two relations: there must be at least three; and in most cases there are four. Thus A may be like B, but there is no analogy between A and B: it is an abuse of the word to speak so, and it leads to much confusion of thought. If A has the same relation to B which C has to D, then there is an analogy. If the first relation be well known, it may serve to explain the second, which is less known; and the transfer of name from one of the terms in the relation best known to its corresponding term in the other, causes no confusion, but, on the contrary, tends to remind us of the similarity that exists in these relations; and so assists the mind instead of misleading it.
Sunday, August 7, 2016
"Master" and the Corporatization of the University
I owe many of these thoughts to a series of insightful conversations I had with an academic mentor of mine over the course of the spring of 2016.
Yale's decision last year to abolish the term “master” was denounced by campus conservatives as yet another dialectical development in our contemporary obsession with political correctness and racial sensitivity. The hubbub began when Master Davis of Yale’s Pierson College emailed his students that he would no longer use the title. He reasoned that “master” inevitably conjures up the brutal legacy of American slavery, explaining “there should be no context in our society or in our university in which an African-American student, professor, or staff member—or any person, for that matter—should be asked to call anyone 'master.'”
The conservative response emphasized the deeply ahistorical nature of Master Davis’ thinking and the stupendous silliness of suggesting there is anything racially intimidating about the title of an individual whose primary professional function is to maintain the students’ multi-million dollar playpen. Several searing critiques—two written by friends of mine—argued persuasively that the title “master” derives not from American plantations, but from the halls of Oxford and Cambridge, where the term connoted a healthy respect for authority and wisdom. Yet rather than reflect further upon the history and meaning of the title, campus discussion pivoted to the more politically sensationalist debate over trigger warnings and safe spaces.
The debate over “master” of course does have a great deal to do with debates over race and free speech. But in so emphasizing that aspect of the issue, another deep implication of the controversy was overlooked: the traditional vision of education embodied in the term “master” is under siege by the corporatization of the university.
The charge that Yale is becoming too corporate means all things to all people, as a friend of mine described in an excellent article on the topic. From the Left, “corporatization” refers to the university’s neoliberal, profit-maximizing impulse that comes at the expense of workers’ rights. From the Right, it describes the elevation of efficiency-minded technocratic bureaucrats over professors and scholars.
Importantly, both conceptions of corporatization have much to do with the debate over the title, “master.” Mastery in the collegiate context refers not to political mastery over men, but intellectual and academic mastery over a craft. As a professor of mine put it, the term should invoke the wisdom of a Jedi Master or the artistic genius of Rembrandt and Vermeer, not the brutality of ante-bellum plantation politics. As Adam Smith explains:
All such incorporations were antiently called universities … The university of smiths, the university of taylors, &c. are expressions which we commonly meet with in the old charters of ancient towns. When those particular incorporations which are now peculiarly called universities were first established, the term of years which it was necessary to study, in order to obtain the degree of master of arts, appears evidently to have been copied from the term of apprenticeship in common trades, of which the incorporations were much more ancient. As to have wrought seven years under a master properly qualified, was necessary, in order to intitle any person to become a master, and to have himself apprentices in a common trade; so to have studied seven years under a master properly qualified, was necessary to entitle him to become a master, teacher, or doctor (words anciently synonimous) in the liberal arts, and to have scholars or apprentices (words likewise originally synonimous) to study under him.
The real argument that needs to be made is one that endorses the vision of education embodied in the language of mastery—the vision of an essentially apprentice-based model of liberal learning. The title should be maintained precisely because it does not connote raw power, but rather expresses the rightly directed reverence and respect students ought to have for their teachers. This sort of reasoning can also help illustrate the conservative objection to a corporate model of university governance. We wish to preserve traditional modes of learning because of the values and ideals they promote. What looks like an inefficiency to the technocratic administrator is in fact an institution purposefully designed to make a distinctive kind of education possible.
We should be wary of ever-greater university centralization precisely because it undermines the traditional purpose of the university's academic structure. To apply the standards of bureaucratic efficiency to an institution oriented toward a radically different set of goods is to make a very basic and very dangerous category mistake. Indeed, the awkwardness of the title "master" in contemporary discourse illustrates our collective forgetting of its traditional meaning. That amnesia accompanies a radical re-conceptualization of the university, leading undergraduates to view themselves as consumers empowered to customize their education and graduate students to view themselves as laborers deserving union representation.
The point then is that the title “master” is itself a positive good. It should be defended not only because of the ineptitude of its critics, but because the title embodies a rapidly disappearing vision of what college education should be. Just as the consolidation of power in the hands of distant bureaucrats undermines the academic character of the university, so too does abandoning these traditional linguistic bearers of respect and authority lead us ever-further away from that most noble purpose of college education.