By Victor Davis Hanson
Tuesday, December 18, 2018
The great culture wars on the campuses of the 1980s were
largely lost by traditionalists. And the question then became not if but when
the liberal arts would die off as a result. What is strange, nearly 40 years later, is the apparent outrage now that what was clearly foreordained is becoming fact. What did academia expect, given its years of narrow specialization and politicized indoctrination?
Recently the University of Wisconsin–Stevens Point
announced plans to drop liberal-arts majors in geography, geology, French,
German, two- and three-dimensional art — and
history. The Atlantic ran a
well-meaning essay by Adam Harris on the controversial move, “The Liberal Arts
May Not Survive the 21st Century” — again, a topic much in the news recently.
The article’s chief thrust is that insidious efforts to promote STEM vocationalism — the need to prepare young people for careers requiring extensive math and science skill sets — have driven out in-depth focus on the liberal arts, in a climate in which crass Republican state legislators, in allegedly vindictive and short-sighted fashion, demanded catastrophic cuts in state public higher-education budgets.
The Stevens Point campus highlighted a popular perception
that emphases in literature, history, or languages lead nowhere for
cash-strapped graduates but to more debt and fewer jobs. Yet what the article
on official university policy misses is why
students do not concentrate in the liberal arts in the fashion of the past.
After all, only the fact of declining enrollments allows the university to institutionalize the unspoken reality of eroding student interest. In other words, the university is simply burying liberal-arts majors that were already killed off, not by bottom-line-minded state legislators but by the choices of students or faculty, by university policies, or by some combination of all three.
If higher education’s increasing fixation on job training
is the whirlpool that swallows history majors, the monster across the narrow
straits of liberal-arts education is a many-headed politicized orthodoxy, a
Scylla that consumes the flesh of the liberal arts and leaves the bones as
dreary reminders of boilerplate race, class, gender, and culture agendas. In
the case of history, increasingly few wish to sit in a class where the past
becomes tedious melodrama rather than complex tragedy, a sort of reeducation
camp in which modern standards of suburban orthodoxy time-travel to the past in
order to judge materially impoverished historical figures or pivotal events as
either culpable or exonerated.
The tragedy, then, is not just that a campus of the
University of Wisconsin would drop the history major but that the custodians of
history in the 21st century have lost the ability to teach and write about history in a way that sustains a hallowed 2,500-year tradition. In other words, what is being jettisoned is likely not history as we once understood it but rather de facto poorly taught “-studies” courses — which sadly become snapshots of particular (and often small) eras of history — designed to offer enough historical proof of preconceived theories about contemporary society. The students then are assumed by the
course’s end to be outraged, persuaded, galvanized, and shocked in politically
acceptable ways. Usually they are just bored, as supposedly with-it professors
endlessly regurgitate the esoterica picked up in graduate schools.
Of course, not all historians see the past as an orthodox
way of fixing the present, but enough do to discourage students, especially
when younger faculty members draw on their rather specialized doctoral theses
or narrow journal-article expertise to drive home an agenda that seems preachy
or proselytizing to naturally resistant young spirits. To the Millennial mind,
calcified Sixties-era radicalism is about as edgy as the Stalinist Old Left of the 1930s once was, sermonizing to the Woodstock crowd. Trendiness that once pleased
faculty committees and careerist deans did not always please students, and
therefore the result is now not so pleasing to faculty committees and careerist
deans.
Once a student signs up for a class on the Renaissance or the Great Depression and quickly learns that it can become a
periodic harangue on the oppression and victimization of particular
marginalized groups, she will likely not wish to repeat the experience on money
borrowed at between 5 and 7 percent interest, or to be convinced that her
future employer wishes to be woke by a heady 21-year-old. The irony about the Atlantic article is that when it quotes
liberal-arts and history professors to document their outrage at the Wisconsin
cuts, their defense of their fields becomes an argument that history is necessary to advance particular contemporary agendas. So, for example, we are
told that “in mid-November, the university announced its plans to stop offering
six liberal-arts majors, including geography, geology, French, German, two- and
three-dimensional art, and history. The plan stunned observers, many of whom argued that at a time when
Nazism is resurgent society needs for people to know history, even if the
economy might not” (emphasis added).
So “stunned observers” offered a touché to both today’s
Nazi apologists and right-wing money-grubbers!
In a utilitarian sense, students certainly can benefit
from becoming aware of Nazi-like dangers by studying history. Unfortunately,
few universities offer courses in World War II, which might most effectively
offer a variety of explanations of why Nazi Germany was able to absorb most of
Europe and trigger what would become a global conflict that cost 65 million
lives.
But when one looks at the Wisconsin campus catalogue, one
seems to find few if any classes on
World War II. The closest might be “Women, War and Peace,” “Dilemmas of War and
Peace: An Introduction to Peace Studies,” or “War and Propaganda in the 20th
Century.” No doubt such offerings might be great courses, but I don’t think
they would cover fully the Nazi aggrandizement of the late 1930s, particularly
the role of Soviet collaboration, British and French appeasement, and American
isolationism, or the tragic circumstance of the Munich Agreement — in other words, likely the best way for students “to know history” relevant to any purported contemporary Nazi ascendance.
Nor am I sure we can agree that we live in a time “when
Nazism is resurgent.” Certainly the world’s most frightening societies are
North Korea and Venezuela, where wide-scale poverty and government oppression
are normalized. Both are failed Communist states. The current likeliest threat
to the global order for future generations of liberal societies will be statist
and authoritarian China, whose government is still proudly Communist in a
tradition that includes Mao Zedong’s 50 to 70 million dead. The point is that
if students are interested in riveting history classes, they will probably not
wish to be told that they should enroll in one simply because “Nazism is resurgent”
in today’s West.
Harris in the Atlantic
article also notes that “the chairs of each department at the University of
Wisconsin at Stevens Point were assessing their programs ahead of a biweekly
meeting with the dean.” The chair of the history department “may not have been
feeling great, but he was at least upbeat. ‘I felt like the department had
really diversified its curriculum in a way to shore us up.’”
One wonders what exactly “diversified its curriculum”
means — the greater inclusion of now rare military, diplomatic, or political
history, more-diverse intellectual approaches, or a greater variety of
liberal and conservative historians? Or does the reference to “shoring up” suggest
ever narrower race, class, gender, and environmental courses, which deductively
seek to use the past to lead students to preconceived contemporary agendas —
and thereby so often erode students’ natural interests in history?
Yet if one walks through the local Barnes & Noble
bookstore, reviews the non-fiction best-seller lists, scans Amazon’s most-read
categories, or looks through book ads in popular magazines, one is struck by how
well biographies of Churchill and Grant sell, and how histories of war and peace,
exploration, and political careers capture the public interest — reminding us
that the fault of declining college interest in the liberal arts may be not in
the stars of vocationalism or the wrong values of students but rather deep
within the university faculty and administration themselves.
Twenty years ago, John Heath and I co-authored Who Killed Homer? The Demise of Classical
Education and the Recovery of Greek Wisdom, a failed warning that formal
university study of the classical past was dying off, caught between
vocationalism, or narrow specialization, and politicization of both the
classics curriculum and faculty research. In other words, at the very time it
was becoming necessary to offer the university a counter-argument that liberal
arts do enrich the education of business and tech grads and can make them both
more-aware citizens and better managers and engineers, liberal-arts faculty in
so many cases narrowed their fields, employed a new off-putting jargon, and
recalibrated the past in monotonous fashion as a primer on victims and
victimizers. And students as a result walked, as academics killed their own
field and blamed their suicide on larger sinister forces in society.
Often, the choices of young professors and scholars are
limited by career concerns and limited time. If a young, recently minted classics or history Ph.D., lucky enough to land a temporary lectureship, had to choose between, on the one hand, tutoring all his struggling students in Latin 1A and giving community lectures on Spartacus or Thermopylae at the local Rotary Club and junior high, and, on the other, writing a journal article on the perspectives of ethnic, gender, and race relations and their empirical referents in Roman cults in Asia Minor, or yet another scholarly note on abnormal uses of the optative mood in Diodorus, then choosing politicization and specialization would be the far wiser career move — even as participating in broad community and undergraduate outreach, in this endangered climate, would be the wiser investment to save an entire discipline.
The provost in the Atlantic
article is quoted as rightly asserting, “We’re facing some changing enrollment
behaviors. . . . And students are far more cost-conscious than they used to
be.” But the reason students are more “cost-conscious” than they used to be is
perhaps because in aggregate they currently owe about $1.5 trillion in student
loans. And that staggering and unsustainable debt is largely because colleges
and universities on average have jacked up their costs faster than the annual rate of inflation, perhaps in part owing to federally guaranteed
student loans that encouraged undisciplined spending (whether for ever more
diversity czars and assistant deans for inclusion or for Club Med student
unions).
These price hikes were not accompanied by any guarantee
that students on graduation would read, think, and compute far more effectively
than when they were admitted, or at least at a level necessary to assure employers that graduates of four-year colleges had more advantageous assets than those with far less liberal-arts education. The university was largely unable or unwilling to reconsider orthodoxies such as lifetime tenure and the inordinate number of units taught by exploited part-time faculty, or to entertain reforms such as national exit exams for granting the B.A. degree and allowing an M.A. degree in math, science, and the liberal arts to substitute for the teaching credential.
Over the past few years, lots of employers have privately
concluded that today’s graduating liberal-arts majors are quite confident and
yet so often poorly educated. Or worse, those hiring were turned off by the
strange combination of youthful ignorance and arrogance. Employers clearly had
no desire to be enlightened by fresh graduates who were entirely unaware that
their inductive skills were suspect or nonexistent.