William Paley claimed that the “university exists to form the minds and
the moral sensibilities of the next generation of clergymen, magistrates, and
legislators.”[1]
The assumption at Cambridge in 1785 was that both “individual conduct and a
social order pleasing to God can be known and taught.”[2] To
know, outside of divine revelation, what is pleasing to God was typically
considered presumptuous back then, because finite human knowledge cannot
claim to encompass all possible knowledge. Not even AI, a couple of decades
into the twenty-first century, could make that claim. Although infinity itself is not
necessarily a divine concept—think of infinite space possibly being in the
universe—it cannot be said that humans have, or even are capable of having,
infinite knowledge. Theists and humanists can agree on this point. So, when a
professor decides that a political issue is so important that it justifies using a
faculty position to advocate for one's own ideology in the classroom, the
presumptuousness can be said to reek to high heaven. I assume that any ideology is partial,
and thus partisan, rather than holistic. Both the inherently limited nature of
the human brain, and thus human knowledge, and the presumption of an instructor
to use the liberal arts, or the humanities more specifically, to advocate for one’s
own ideology were strikingly on display at a panel on what the humanities
should contribute regarding climate change. The panel, which consisted mostly of
scholars from other universities, took place at Yale University on Ash
Wednesday and Valentine's Day, 2024. Perhaps on that day, when the two
holidays aligned, both fear of our species going extinct—literally turning to
dust—and love of our species and Earth could be felt. That we can scarcely imagine our planet
without our species living on it does not mean that such a scenario could not
happen; and yet I contend that the humanities should not sell their soul or be
ideologically romanticized into being transacted away as vocational knowledge, as
if the humanities were more fittingly asked how to do something rather
than why something is so. Going deeper, rather than abandoning the intellectual
raison d'être to tread water at the surface by metastasizing into training and
skills, is not only the basis of the humanities' sustainable competitive
advantage within a university, but also the best basis from which the humanities
can contribute to solving the problem of climate change by getting at its
underlying source. Neither a political ideology nor skills in "knowledge-use"
can get at that source; rather, both are oriented to relieving symptoms, which,
although very harmful, could be more expeditiously redressed by
discovering and understanding their root cause. So I am not claiming that
universities should do away with applied science and research on technology,
such as to absorb carbon from the seas and atmosphere; rather, I contend that
the liberal arts and sciences, especially the humanities, should not be turned
into engines of application.
One panelist opened minds in the room to a tension within liberal arts.
While the humanities are liberating for a free person, individual research and
truth-seeking are in tension with forming groups with shared understandings.
Both, the panelist asserted, are part of liberal arts education. When I was a
student at Yale, I applied truth-seeking to theology, philosophy, history,
and constitutional law; I also joined a debating society, or "political party,"
in the Yale Political Union. Some of the ideas I came up with in my studies
were unique, whereas in the debating society (which owned one of Yale's secret
societies) I conformed to an ideology. The other members of that
“political party” engaged with me during the debates on ideas stripped of the
usual distracting media-driven sideshows.
Both my own studies and debate in the Yale Political Union held my ideology
in check, though obviously didn’t eliminate it.
In fact, both my ideas and ideology have changed since I studied at
Yale; for one thing, I went on to study historical moral, political, and
religious thought at another university after graduating from Yale. I had moved
from natural science, to business, to the social sciences, and finally to the
humanities. I wanted a firm foundation in the latter. After all, political
economy and economics were once part of philosophy; Adam Smith was a professor
of moral philosophy. It is from my broad educational background, to which I
sacrificed entirely too many years of my youth (without being a professional
student!), that I took in the panel’s opinions on how the humanities should
address climate change. I was not looking through ideological glasses so much
as through those of the humanities and business.
A participant on the panel defended the liberal arts, possibly because
business-oriented “academic” administrators at some schools were shedding parts
of the humanities. West Virginia
University, for example, had recently announced that it would eliminate
foreign-language study because the students' eventual employers would not value
that kind of knowledge. Not even administrators at business schools should
limit the courses offered to those that they think CEOs would like. When I was a graduate student in business at Indiana
University, the dean told the incoming class that we would not use the knowledge we obtain for 10 or 15 years. “We’re not here to train
you,” the dean said. Indeed, more than one CEO told me that they did not want
business schools to train future employees: “We can do that; we want business schools to do what we can’t do—educate them.” At the time, large CPA firms were hiring
English majors because such students could reason well. Deans of the liberal
arts and sciences should therefore not listen to the corporate sycophants
running business schools.
Liberal arts “are about questioning and liberating,” one of the
panelists insisted. During World War II,
the liberal arts were criticized in the U.S. for being a luxury that a country
at war simply could not afford. Wendell Willkie, who ran for U.S. president
against Roosevelt in 1940, defended the humanities as the franchise of the mind
to free the mind, and an open mind is beneficial to economic and political
freedom—virtues whose value could be confirmed merely by looking over at Nazi
Germany. Willkie’s claim is in line with Paley’s thesis that the humanities are
useful in terms of individual conduct and social order, whether of a religion,
an economy, a political system, or a university.
If the liberal arts really are about asking why beyond the opinions broadcast in the public square, and are thus about
freeing the mind from societal constraints, then it could be asked, as one
participant did: “Is climate-change a necessary thing that liberal arts and
science has to do, or is covering climate-change a luxury?” What is the
opportunity cost—the benefit that is forgone—in orienting the liberal arts and
sciences to applied work so as to reduce carbon emissions? If a person loses
one's soul trying to be someone else, I submit that the benefit would be less than had
the person stayed true to oneself in making a contribution. The same holds for
academic disciplines. Math students should not be forced to be trained in
accounting. I once worked in that field, and, believe me, I’m no mathematician.
So how can the humanities help us to understand climate change? A
couple of panelists distinguished local, national, and global social scales.
Scaling, or viewing the world in terms of different scales, was said to be relevant
to climate change. One question that the liberal arts and sciences could answer
is, “Are different scales naturally related?”
I thought of the natural fractals in chaos theory, and the research that
has gone into applying the natural sciences, including evolution, to social
organization. By 2020, the imprint from the aggregated energy consumption of
individuals was clear on the global scale; we had entered the Anthropocene era
in which aggregated individual conduct really could change the natural world on
a global scale. Even so, studying scaling only goes so far; it does not get at
the root cause: why aggregated individual conduct has become so
detrimental on the global scale.
Time can be thought of as a
scale. Universities are not necessarily set up for the long term, one panelist
claimed; and yet another panelist pointed out that, traditionally, college has
been seen as leisure, and thus not as something oriented to a demand for
immediate action. Indeed, going to college used to be a luxury because students
could take several years off from full-time work in order to become knowledgeable. I
would add becoming better at reasoning, which training students in skills does not
sharpen nearly as much. This is one reason why I studied at Yale after my
studies in business. It is also why Richard Brodhead, the dean of Yale College when
I was there, wrote to the undergraduates in the liberal arts and sciences that
business would not become a major: “Let us educate you; we know how to do that. Then you
can go out and get trained.”
Certainly, the humanities are not inherently oriented to serve
immediate action, and yet climate change has become urgent because governments
have not stood up to their polluters. Is this, however, the root cause? Does
losing the climate-change battle boil down to a dysfunctional political economy
steeped in corruption? The humanities can dig deeper than political economy. So
to siphon the liberal arts and basic sciences into serving only that which is
immediately useful can be reckoned as dogmatic both in terms of being arbitrary
and imposed. Einstein was not awarded a Nobel prize for either of his theories
of relativity because at the time they could not be tested empirically. It was
more than a decade after his special theory of 1905, and four years after his
general theory of 1915, that a solar eclipse in 1919 provided empirical support
for the claim that gravity from a large mass bends space itself. Limiting
theorizing in astrophysics to knowledge that can be empirically tested and is
immediately useful would at the very least be “penny wise, pound foolish.” Such
a foolish litmus test would cut off too much paradigm-changing knowledge. I
don’t think the stricture of immediate usefulness should deplete human
knowledge of the possibility of ongoing scientific revolutions; such revolutions are hard
enough, as Thomas Kuhn argued. I submit that this also applies to the
humanities. Should they study only those causes that can be immediately acted
upon? At the very least, uncovering a cause of a phenomenon instantly
highlights the symptoms as
symptoms.
Such artificial delimiters as immediate action come from
not only empiricist philosophers of science such as Popper, but also from
business schools, especially in a culture in which business is revered. I
suspect that many humanities professors in the American states are unaware of
how much they have imported not only from business schools, but also from the
business world itself. One of the panelists insightfully observed that education
as (vocational) training is transactional, whereas the mantra of the liberal
arts and sciences is knowledge for its own sake. So it is at a fundamental
level that the litmus test of immediate action is so exogenous, or foreign, to the humanities. Unfortunately, vocational skill had been eclipsing
even basic knowledge at many American universities since the rise of business
schools in the 1980s. American business culture has been so salient in the
societal cultures of many of the American states that even humanities
professors at state universities use PowerPoint presentations that reduce
knowledge to bullet points.
The panelist from Arizona State University taught at the time in the
School of Ocean Futures, which in turn is in that university’s College of
Global Futures, as if they were labels for academic schools of knowledge rather
than, in actuality, marketing slogans and ideological jargon. She spoke in
terms of training students in skills based on useful knowledge. That only such knowledge is to be taught and
researched there is clear, for the panelist bragged that useful knowledge is literally “etched in the stone” that displays the university’s
mission. It is telling that Einstein would not have been welcome at ASU.
Furthermore, that panelist spoke of “teams” of students in her classroom,
as if she were a manager at a corporation referring to her subordinates. She
also spoke of the need to turn her students into leaders, by which she meant
practitioners. The term leadership had come to be so vague that it could be
both a vehicle and a cover for ideologically infused agendas. For example, in
the business world, freelance “leadership coaches” roam on a bad metaphor without
even bothering to distinguish leadership from management or supervision. It was
as if that panelist were so ensconced in vocational jargon that she took it as
legitimate for academic knowledge.
It was very clear that she thought that the humanities should only be “applied knowledge.” She bragged that ASU was inaugurating a
“general sustainability” requirement for every student as a means to get them
to display leadership in 20 years. Never mind that sustainability is not an
academic term, even though ASU has a School of Sustainability (and another of
Leadership). She said that the “sustainability” requirement is meant to “train
students to envision alternative futures” and to “give skills.” She was quite explicit
that training is what university faculty should do. That this might fit a
school of global futures does not mean that her vocational orientation should
be applied to faculty and students in the liberal arts and sciences. Of course,
requiring certain courses with a vocational and ideological agenda comes with
an opportunity cost because other courses that might otherwise be required are not
chosen. Rather than having students study one issue, a required course in logic
would strengthen their reasoning ability, which they could apply to any topic.
Also, a semester or two of Latin would help immensely with understanding
English grammar, which, along with logic, is (as I have found) extremely useful in
writing on a variety of issues. The world needs excellent thinkers rather than just
skill-doers.
ASU at the time had an applied lab in a new building. The technological
research there was on “carbon trees,” which can absorb carbon from the atmosphere.
Such labs are definitely needed, and the need is indeed urgent; precisely because
such labs already exist, imposing applied research
on humanities faculty and students is not only unnecessary, as it would be
duplicative, but also, as I have already argued, detrimental to the humanities.
Even the priority of ASU’s president at the time on forging lucrative applied
labs with corporations and government can be criticized as detrimental to the
liberal arts and basic sciences. Ironically, post-doctoral researchers in those
labs have complained that the funding by the university has not been sufficient
for a course of research to be followed to a conclusion.
The participant from ASU also admitted that she was unapologetically
ideological in the classroom in advocating her ideology of “stewardship and
responsibility” applied only to climate change. She also answered a question by
insisting that advocating is a legitimate teaching role because her
issue is “too important.” No ideological faculty member of a university would
need to use much self-discipline in pledging not to advocate on issues that are
not important. In short, her stance would open the floodgates to going
after “bad” ideologies and promoting one’s own as a significant part of teaching.
It is telling that the panelist viewed critical thinking as being able to
distinguish “true from false information.” When I heard that, I thought of
“fake news.” Even though I believe that climate change is a very important
issue, I don’t see a college teacher’s role as including rebutting statements
made on Fox News.
Recall the caveat attending Paley’s assumption: it is presumptuous to claim to know
which individual conduct and social orders are pleasing to God. Viewing
one’s own ideology as true knowledge is antipodal to such epistemological
humility. The ASU panelist, whose background outside of academia includes advocating
on her issue to Congress, was narrowing, and thus warping, “critical thinking”
to being ideologically opposed to conservatives. Of course, she had not applied
such thinking to her assumption of having true information, or even to her
assumption that her cause is so vitally important that it is worth hijacking
her teaching role to spread her ideology in the classroom. What about nuclear
war? What about AI getting out of hand? What about the impotency of human
rights in the extant global order? Are teachers whose values galvanize around any
of these issues not allowed to turn their respective classrooms into ideological
soapboxes because these issues are not important enough? Strangely, she said
that liberal arts students should come to “appreciate different ways of seeing
things.” Apparently, this is so as long as their views are in sync with her
ideology. I submit that different ways include unpopular, and thus disliked,
ways. To the extent that the status quo itself has contributed to the problem
of climate change, then thinking through alternative paradigms and getting to
the root problem of the extant paradigm is of great value. I submit that the
world needs a lot of different and more fundamental thinking even though
ideological strictures narrow or even block such thinking.
Lastly, the panelist from ASU was dismissive of another panelist’s
suggestion that the humanities should ask whether our “virus” species—think
of the film The Matrix—should survive. She said
that it is a bad question, so she would bar it even in the humanities. Her prejudice
was clear when she referred to the humanities as being “too Ivory Tower.” Although the event was indeed atop the Kline
Tower on Yale’s campus, the building’s exterior walls were still made of brick
rather than ivory.
So, what can the humanities offer in line with the nature of that knowledge?
Beyond scaling, I suggest that rather than showing ASU students maps of places
that may be flooded from climate change, the distinction between a cause and
symptoms should be studied. Going beyond the latter would be extremely
beneficial to the survival of our species. I would include study of Thomas Malthus’s
1798 text, An Essay on the Principle of Population, because the exponential increase in (over)
population during the twentieth century is arguably the root cause of climate
change. Deep thinking, rather than being trained on decision-making skills, not
only is the forte of the liberal arts (and sciences), but paradoxically can
also leap over policy and technology in revealing the underlying problem, which
is a prerequisite to really solving the problem rather than merely addressing
admittedly injurious symptoms as they crop up. Malthus claims that a species’
population can outstrip its food supply, and I would add its energy supply. If
we don’t self-regulate our species’ population, nature will step in, whether in
famine, pestilence, or war, according to Malthus. We could add to these three a
shift in the equilibriums of the global climate and ecosystems beyond the
habitable zone for Homo sapiens. We, the wise human species, likely have innate
and learned attributes responsible for the astonishingly fast growth and size
of the human population on Earth. So we can go even deeper than overpopulation by
drawing on the deep knowledge of the humanities. Even theory development in the
social sciences can be done in part to rectify the flawed political and related
economic institutions and cultures that have enabled the explosive population
growth. If our species, Homo sapiens, really is wise (sapiens), then reducing
knowledge to skills that are immediately useful seems perplexing to me.
Ironically, going deeper has a better chance of solving a myriad of problems
beyond a few quick fixes.
1. A. M. C. Waterman, Political Economy and Christian Theology since the Enlightenment: Essays in Intellectual History (New York: Palgrave Macmillan, 2004), p. 211.
2. Ibid.