14 February 2012
Professor Murray must be overjoyed. You know you’ve made a real impact on the culture when the entire editorial staff of The New York Times seems to be taking turns weighing in on your latest offering.
As a rule, David Brooks is one of the members of that staff most worth listening to. However, it is deeply ironic that he calls today’s column “The Materialist Fallacy,” and blames both Murray and his liberal critics for falling prey to this fault of logic.
The irony lies in the fact that, out of all the NYT‘s editorialists, Brooks is probably the worst offender when it comes to swallowing supposedly scientific claims about human nature and human behavior hook, line, and sinker.
Today’s column is not the worst example of Brooks’s weakness for scientism, but it is still pretty rich that he of all people should accuse conservatives like Murray of peddling a “culturally deterministic” theory, when this label fits the research Brooks himself cites at least as well.
In other words, Brooks’s latest column is a case of the pot calling the kettle black, if there ever was one.
Let’s see what he has in mind when he accuses others of committing “the materialist fallacy.” Here is how he summarizes the findings in Murray’s book:
The half-century between 1912 and 1962 was a period of great wars and economic tumult but also of impressive social cohesion. Marriage rates were high. Community groups connected people across class.
In the half-century between 1962 and the present, America has become more prosperous, peaceful and fair, but the social fabric has deteriorated. Social trust has plummeted. Society has segmented. The share of Americans born out of wedlock is now at 40 percent and rising.
This is a very interesting point, and one that is not stressed often enough. If economic forces were the main factor in the moral decline of the American working class, then why has the worst deterioration occurred during the period of greatest prosperity?
Next, Brooks claims there have been three major types of theories advanced to explain the data, from three different points on the ideological spectrum:
As early as the 1970s, three large theories had emerged to explain the weakening of the social fabric. Liberals congregated around an economically determinist theory. The loss of good working-class jobs undermined communities and led to the social deterioration.
Libertarians congregated around a government-centric theory. Great Society programs enabled people to avoid work and gave young women an incentive to have children without marrying.
Neo-conservatives had a more culturally deterministic theory. Many of them had been poor during the Depression. Economic stress had not undermined the family then. Moreover, social breakdown began in the 1960s, a time of unprecedented prosperity. They argued that the abandonment of traditional bourgeois norms led to social disruption, especially for those in fragile circumstances.
But why must we choose among these three theories? It is perfectly plausible to suppose that all three factors played a role in the moral decline that Murray has documented.
Chronic unemployment may not be enough by itself to cause social breakdown, but in combination with the carpe diem culture and the Great Society institutions that put an empty, hedonistic lifestyle within the economic reach of millions, it is not difficult to see how deindustrialization may have played a triggering role.
A loosened rock, its path smoothed by moral and institutional clear-cutting, set off a social avalanche.
But Brooks claims these common-sense ideas have been superseded by social-science research in recent decades. Specifically, he cites four basic findings. This is the first:
Over the past 25 years, though, a new body of research has emerged, which should lead to new theories. This research tends to support a few common themes. First, no matter how social disorganization got started, once it starts, it takes on a momentum of its own. People who grow up in disrupted communities are more likely to lead disrupted lives as adults, magnifying disorder from one generation to the next.
In other words, bad habits are hard to break and get handed down from parents to children. No quarrel from me on that one! But one wonders why Brooks thinks this is breaking news.
Second, it’s not true that people in disorganized neighborhoods have bad values. Their goals are not different from everybody else’s. It’s that they lack the social capital to enact those values.
So, all of those out-of-wedlock babies do not point to a revolution in the way men and women view marriage? Rather, the men and women in question would rush to the altar if only they had the wherewithal to do it up in proper bridezilla fashion?
Of course, Brooks and the studies he alludes to are no doubt right that many people in the communities in question probably do long for a different way of life—do feel in their bones that the lives they are leading are wrong.
But that doesn’t change the fact that they wouldn’t be leading such lives at all if sexual promiscuity had not first been destigmatized. That sea change in the culture around them was a necessary condition for the illegitimacy epidemic.
Third, while individuals are to be held responsible for their behavior, social context is more powerful than we thought. If any of us grew up in a neighborhood where a third of the men dropped out of school, we’d be much worse off, too.
Well, yes. In the aggregate—and questions of individual responsibility aside—the social surroundings of people influence their behavior. Who would dream of denying such a banality?
I would even be so bold as to go further: Both ideas and social policies matter to the lives of millions of people. After all, that’s why we’re having this conversation in the first place.
Please note, in passing, that, if the conservatives’ emphasis on morality is “cultural determinism,” then Brooks’s appeal to “social context” is cultural determinism on stilts.
In fact, neither theory really fits that description, as Brooks instinctively realizes by inserting the caveat about individual responsibility. The term “determinism” is not properly applied to statistical regularities, but rather ought to be reserved for the fallacious claim that individuals have no choice in their actions—that free will does not exist.
It is clear that Brooks does not believe this, and it is doubtful that very many cultural conservatives do, either. So, the whole rhetoric of “materialist fallacy” and “cultural determinism” is simply misplaced in this context.
The recent research details how disruption breeds disruption. This research includes the thousands of studies on attachment theory, which show that children who can’t form secure attachments by 18 months face a much worse set of chances for the rest of their lives because they find it harder to build stable relationships.
This is the only semblance of a real idea on Brooks’s list. But once again, social science is being brought to bear to lend a patina of unanswerable authority to what is nothing more than common sense.
Who in his right mind ever doubted that it was bad for babies to be neglected?
In his appeals to recent social science, Brooks does not add anything substantive to the debate over Murray’s thesis about the collapse of morality among the white working class in America.
And he certainly has no business accusing Murray and other conservatives of the “materialist fallacy” to which he is all too prone himself.
It is not “materialism” or “determinism” to point to the reasons why human beings behave as they do. Of course, human beings act for reasons—we do not just act at random, for no reason at all. But human motivations are not causes operating according to natural law.
And it is certainly not “culturally deterministic,” much less “materialistic,” to point to those motivations that are bound up with morality—or the lack of it.