Thursday, November 20, 2014

Ethical Theory in Business Ethics Courses

It may seem like an oxymoron, but faculty administrators even at research universities can be hopelessly narrow-minded regarding knowledge and how it is to be conveyed. For example, how often are faculty members encouraged to give a lecture or two re-teaching material largely missed on exams (followed by another, shorter examination on that material)? Do faculty administrators work with faculty members in professional schools to see to it that the applied courses are not severed from their basic discipline (i.e., the more theoretical substratum)? One of the secrets in the “sauce” at Yale’s professional schools (e.g., Law, Divinity, etc.) is the salience of the respective basic disciplines (e.g., political theory and theology, respectively). Synergy comes gushing through once the false dichotomy is recognized. Before I went to Yale, I was a master’s and doctoral student at the University of Pittsburgh, where the dichotomy was alive and well in the university’s social reality; I had to “walk back” the dichotomy myself as I discovered philosophy (and religious studies) while I was still studying business.

Business ethics was one of my doctoral fields of study at Pitt. The philosophy department there was at the time one of the best in the U.S., so I used an elective to take my first course in the discipline. I began with two intro courses; before I knew it, I was taking junior and senior courses, such as logic and philosophy of mind. The latter course turned out to be the most intellectually intense course I took in my 18 years as a university student (had I discovered philosophy in college, I would have three rather than five degrees). It then occurred to me to start taking ethical theory courses, given my doctoral field. Within philosophy, I gravitated to practical philosophy—in particular, to ethics, political theory, and philosophy of religion. I treated these as foundations for the field of business, government, and society in business.
It dawned on me that none of the business doctoral students concentrating in business ethics had taken an ethical theory course in philosophy. That is to say, I was stunned to find a subfield of ethics reduced to management. Ethics proper is a subfield of philosophy, not business; ergo, business ethics is ultimately grounded in philosophy, with managerial implications. I think business schools have put the cart before the horse and let go of the horse. A cart without a horse isn't going to go very far (though perhaps it can go in circles).

From my educational experience, I contend that ethics courses in business schools ought to emphasize ethical theory, with managerial implications/applications used as much to illustrate the theories as to understand the ethical dimension of business. Managers in the business world have told me that business schools should do what corporate training cannot, rather than being duplicative. I think deans miss this point, perhaps because they are so oriented to sucking up to corporate managers in order to get corporate donations. In my own thinking, theory enlivens rather than detracts from praxis. I think business school faculties are in the grips of the false dichotomy. Corporate managers would doubtless admit that they are ill-equipped to teach ethical theory. Moreover, training is a better fit with what corporate folks do. Business schools, or else philosophy departments, could offer regular as well as continuing education courses in business ethics with ethical theory readings, lectures, and discussions going beyond the superficial “rights, utility, and justice” hackneyed reductionism of business ethics courses in business schools. 

Sunday, November 2, 2014

Exclusivism Eclipses Veritas at Yale

Michael Simons, head of the cardiology department of the School of Medicine, made unwelcome sexual advances in writing in 2010 to Annarita Di Lorenzo, a researcher at the school 18 years his junior. Simons wrote that he wanted to kiss her lips, and every part of her “body in every continent and city of the world.”[1] Referring to Frank Giordano, the woman’s boyfriend at the time and subsequently her husband, Simons wrote that she was choosing the wrong man. Simons would keep Giordano from important meetings and assignments. The relationship between the two men became so difficult that Jack Elias, the chairman of medicine, took over the direct supervision of Giordano to protect the untenured instructor from Simons.

Nevertheless, Yale’s provost dismissed a university committee’s recommendation that Simons be permanently removed from his position, opting instead for an 18-month suspension. Faculty members claimed that Simons’ success in snagging $5 million annually in grants from the U.S. Government in 2012 and 2013 had been a factor, as well as the fact that the provost had been chair of the economics department, where Simons’ wife was a faculty member. The monetary element would not be lost on virtually any academic administrator at any university, but the “old boys club” sticky web of connections at the elitist Yale could mean that “outsiders” suffer considerable abuse there; the provost’s dismissiveness of the university committee’s recommendation is but one indication of how distorted the moral compasses can be among the most powerful in the “club.”

As most Yalies undoubtedly know, incremental exclusivisms exist within Yale; the pleasure for a faculty member in being selected to a higher post, and for a student in being “tapped” to join one of the secret societies, strangely depends in no small measure on being able to see others excluded (i.e., hurt). In other words, rather than finding the sheer opening of a door to be fulfilling, the true pleasure lies in watching the door hit others in the ass as the hinges swing back.

For example, as a student at Yale, I spent an “all-nighter” going through the admissions ritual of one of the debating parties in the Yale Political Union. Once I was in, the party’s chairman suggested that I attend a party that the party would be giving on an upcoming Friday night.

“The party owns a secret society,” he explained to me, “so you and all other male members of the party can join the society. This Friday at the party we will be tapping people so you will want to be there.”

Assuming that I would be able to join the society too, I cancelled my pre-existing plans and attended the party, which was in a room halfway up the university’s clock tower. In actuality, only three people were tapped. They had all been officers in the party’s elite, so I quickly realized that the chairman had wanted me to attend not because I would be tapped, but, rather, in order to watch his friends being selected. The chairman subsequently lied to party members—saying that I had misunderstood him. Whereas most party members in such a situation typically resigned with great fanfare during one of the weekly debates, I did not resign; instead, I simply had nothing to do with the party or its little cadre of officers. To this day, I am a member of a party in the Yale Political Union.

My point is that deceit used to protect the club within the club at Yale is not a rare deed. Perhaps you can grasp more fully why Di Lorenzo may have felt no avenue left to her but to leave the university, why her husband’s career was effectively stalled by a jealous faculty administrator, and why the provost dismissed the committee’s recommendation. The pyramidal exclusivism within Yale is so strong that unfairness and even outright aggression may seem justified to some.




1. Tamar Lewin, “Handling of Sexual Harassment Case Poses Larger Questions at Yale,” The New York Times, November 1, 2014.

Thursday, September 18, 2014

Can Ethical Leadership Be Taught?

Can ethical leadership be taught? In the typical business school, this question would be interpreted, or “refurbished,” as: Can students be trained to become ethical leaders? While often conflated, these two questions are indeed distinct. Instructors, professors, and school administrators should first decide which question is more relevant to their purposes. The question chosen should fit with the education, pedagogical method, and philosophy of education of not only the instructor or professor, but also the school itself. In this essay, I distinguish the two questions in order to unpack their full significance.

The question, Can ethical leadership be taught, can be interpreted as being centered on knowledge of the concept and theories of ethical leadership. Can this particular knowledge be taught? That is to say, if a student were to ask, What is ethical leadership? could the instructor or professor answer with a definition? Have scholars even come up with an agreed-upon definition? More broadly, how does ethical leadership as a concept differ from that of leadership more generally? Do theories of ethical leadership explain it, rather than merely prescribing how to practice it? Furthermore, do any extant theories relate the concept to other, related concepts such as strategic leadership or even strategy? If so, can such theories be taught to the students at a particular level of education? Last but not least, would teaching the theories toward an understanding of what ethical leadership is be in line with the approach of the particular business school? Some schools are more commercially oriented than others.

The pedagogical goal in line with the question of whether the knowledge we have on ethical leadership can be taught is that students know more about ethical leadership as a phenomenon. Studying ethical leadership in applications, as for instance through case studies, is oriented here to understanding more about the concept through how it is exercised in practice. To be sure, praxis (i.e., practice) cannot capture the entirety of a given concept. Accordingly, the case studies and even in-class simulations are secondary here.

Moreover, whether or not the students are more able to become ethical leaders themselves is theoretically separate, even if, as Plato wrote, knowing the good is all one needs in order to do it on a regular basis. In other words, understanding what ethical leadership is may contribute toward a student being able to practice ethical leadership, but this byproduct is not the immediate goal in getting students to understand what ethical leadership is. In this perspective, moreover, a business school’s faculty studies the phenomenon of business so as to understand it better.

Alternatively, a business school may be oriented to training its students to be practitioners of particular skills in business. Here, the question of whether teaching ethical leadership can facilitate or enhance a student’s ethical leadership skills in practice is relevant. Crucially, we have shifted qualitatively from education to training—from knowledge to skills.

In their article on teaching ethical leadership, Carla Millar and Eve Poole point to character, or moral intuition, as being improvable in the classroom setting if experiential learning is used.[1] Whereas case studies involve examining how someone else exercised, or should have exercised, ethical leadership, experiential learning focuses on the decision-making in the student. “How would I handle this situation,” according to the authors, stimulates a student’s own moral compass, which in turn can facilitate the student’s skills in ethical leadership. Here, how is the operative question word, as befits the primacy of skill.

Millar and Poole can be criticized for treating a simulation in a classroom as a “real” situation. Relatedly, lab experiments in psychology are not in the “real” world, so the artifacts of the lab environment must be taken into account in analyzing the data. Even if a simulation, or putting the students in the place of an ethical leader in a case study, gives the students an experiential sense of how to be an ethical leader, it is quite a leap to propose that such classroom activities make students more ethical or more inclined to be ethical should they become leaders. Ethical leadership is not like typing. Therefore, I think Plato’s approach is superior even though it puts knowledge of a phenomenon first; ironically, the knowledge-first approach may serve practice best.

In terms of institutional conflicts of interest, being able to recognize them in practice, which is presumably preliminary to an ethical leader being able to avoid them, depends on first knowing what an institutional conflict of interest is. What is the relation between the two roles in a conflict of interest? How do they differ? What is a conflict of interest? Is such a conflict unethical even if it is not exploited? The scholarly literature is split on this last question. Using case studies to learn how to handle institutional conflicts of interest depends on the student generalizing across a sufficient number of cases to figure out what a conflict of interest is. Most likely, however, an instructor simply assumes that presenting a few conflicts of interest to the students is sufficient for recognition purposes, and thus for being able to deal with them. The problem with this approach is that understanding is necessary for recognition beyond the few cases a student has studied. Moreover, studying what a conflict of interest is makes it more likely that a student will take such conflicts more seriously, ethically speaking.

It should be pretty obvious where I stand with respect to the two approaches. I contend that, paradoxically, learning what we know of an applied concept is the best means of enabling students to use the concept in praxis. Abstract knowledge and praxis are synergistic rather than antithetical. If I am correct here, then business school deans who are under the impression that companies want the schools to train future employees are not even serving those companies well by replacing education with training. Generally speaking, if something looks too convenient, it probably is better to take a more arduous route.




[1] Carla Millar and Eve Poole, “Business Schools Are Failing to Teach Ethical Leadership,” The Guardian, November 26, 2010.

Sunday, August 24, 2014

The Sopranos Hit Higher Education: A Case of Academic Politics on Steroids

Academic politics is a staple of university life. What happens when the blood sport gets out of hand may surprise even hardened university heavy-weights, not to mention the general public. I contend that the sordid fraud and other aggressive lies are of the sort that, when manifesting societally, can easily trigger political protests, which in turn can turn enforcers of the law into violators with impunity. The dynamics were on full display in Ferguson, Missouri in August of 2014. A common denominator does indeed exist: the propensity of human nature to abuse a monopoly of power and to view other people as objects rather than ends in themselves.

The entire essay is at "Political Protests"

Saturday, August 16, 2014

Business Culture Forming Higher Education

“Publish or perish” is the infamous mantra of those intrepid scholars who work at research universities and many prestigious Liberal Arts & Sciences colleges dotting the map of the world. The need to demonstrate regular output is perhaps nowhere more stressed (hence, stress) than in the United States. The declining number of tenure positions (amid increasing reliance on adjuncts, not coincidentally) at colleges and universities in the U.S. would be challenge enough for newly-minted doctors aspiring to the intellectual freedom that goes with the protection of tenure. On top of that, the young scholars are increasingly subjected to an "assembly-line" process wherein faculty administrators treat their junior colleagues' published journal articles like chocolates on a conveyer belt; this puts scholarship at odds with itself and is thus utterly self-defeating from the standpoint of society gaining new knowledge.

Notre Dame’s business school, for example, requires two academic articles per year of its scholars who come up for tenure. That particular regimen teems with the oily scent of production management awkwardly applied to a sort of ethereal assembly line that mechanistically combines ideas at regular intervals. Like subprime mortgages going into financial derivative securities (i.e., bonds), the compound ideas are themselves bundled into at least two major “salable products” per year, like clockwork. Such a rationalized, linear process hardly befits the mind of a scientist, social “scientist,” or philosopher. The process of ideational development (i.e., theory construction) can require sudden bursts of inspiration and imagination, stimulated by some good old-fashioned mind-wandering aided by the gentle push of reasoning.[1]

Can scholarship be "produced" as though the ideas were on an assembly line?
Image Source: www.prophecy1.com

Cramming a scholar’s methodology through a “meat-grinder” business model is like forcing a square peg into a round hole. Superimposing “cookie-cutter” production management tools onto scholarship can be expected to result in mediocre work that only incrementally adds to an existing paradigm. Put another way, the approach is not apt to result in another Kant or Einstein, and that's a pity considering how much computer technology could facilitate the scholarship of a great mind.

Enabling the category mistake, not just a few business schools pretend to be businesses, trimming the fecundity of true scholarship off like scrap metal in a factory. Primped-up grizzled suits (i.e., professors who play the part of “corporate executive” aided by a corporate script) pass like whispers through muffled corridors of fresh carpet to train future expert-consultants and managers with the aid of easily-memorizable power-point presentations standing in for lectures. Returning to well-ordered offices with comfortable chairs and only a few carefully arranged stacks of papers on the desk-tops, the “scholars” write what they hope will be pleasing to the practitioners who read the “professional” journals and hire consultants.

With the deepening and widening of human knowledge being presumed readily regularizable temporally into distinct units of production, moreover, the work of a university employee ostensibly tasked with scholarship succumbs to the hegemony of business values. Those include the sheer constancy of monetized worth, as effected, for instance, by the regular output of ideas or automobiles. Moreover, monetization is held aloft as the criterion, fully capable of turning every thing (and person) into a commodity.

I suspect that people around the world, particularly in Europe, would graciously point to the hypertrophic importance that Americans attach to barren metal (i.e., money) and business mores or practices (e.g., being “professional”). In fact, this over-reach of business values in American culture may be spreading to Europe in the guise of avant-garde trends.

Months before the arrival of the turbulent twentieth century, John Watson (1850-1907), a European theologian and clergyman publishing under the pseudonym of Ian Maclaren, put his prescient observations of American culture to paper:  


“The friendly visitor to the United States, who is proud of her achievements and delighted by her brightness, stands aghast at the open and unabashed front of secularity. It seems to him as if not merely coarse and unlettered men, whose souls have never been touched, either by religion or by culture, but that all men, with a few delightful exceptions, bow the knee to this golden calf, and do it homage. Nowhere is there such constant and straightforward talk about money, nowhere is such importance attached to the amount of money which a man has acquired or possesses, nowhere is it taken so absolutely for granted that the object of a man’s work is to obtain money, and that if you offer him money enough he will be willing to do any work which is not illegal; that, in short, the motive power with almost every man is his wages. One is struck, not so much by what is said in plain words (although a dollar is a monotonous refrain in conversation), as by what is implied, and what is implied is this: that if you know the proper sum, any man can be induced to do what you want, even though his health, and his rest, and his family, and his principles, stand in the way.”[2]

More than a century later, European society would be more secular than its American counterpart. Proportionately, many more Americans attend a religious service on a weekly basis. Additionally, the explosion of mega-churches in several of the U.S. states did not translate into a corresponding “growth” in E.U. states. Even so, Watson’s description of the American obsession with income and wealth as necessary and decisive in the valuing of others and oneself resonates with what I have observed. Judging from the reactions of people from abroad, I get the sense that many Americans—especially those working in the business world—do indeed go overboard rather uniquely in reducing personal value to that which can be measured by how much money someone has in a bank account. Just as Watson intended to help rather than beat up on American society, I too want to see more balance as befits the heterogeneity of values that is at home in human nature.


1. Hence the “absent-minded professor” label.
2. Ian Maclaren, “The Shadow on American Life: An Impression of a Recent Visit,” The Outlook, 63 (September 9, 1899), pp. 116-18.

Tuesday, February 11, 2014

Teaching Business Ethics Beyond Skill and Tactic

At the end of March 2010, Warren Buffett spoke to business students at Columbia University. One asked the billionaire whether being ethical in business comes from how a person is raised or out of a business ethics course. Buffett quickly answered that it is learned in the home. I concur.

A business ethics course properly understood is not prescriptive; rather, knowledge of ethical principles is learned and applied to topics in business as well as to business systems (domestic and comparative). Although understanding itself can change behavior, academic study is not geared to making a student more ethical. The focus, in other words, of an instructor or professor of business ethics ought to be on imparting knowledge rather than skills that somehow turn people into ethical human beings.

Following bullet-points on how to make ethical decisions, for example, does not make one ethical, just as the manipulation of symbols according to rules does not constitute understanding. Rather, for a person who is inclined to consider the ethical dimension at work or more generally, knowing more about ethics can come into play.

It follows that business ethics is not a skill or a list of tactics; education is not vocation. Paradoxically, understanding ethical principles and explaining their relevance to business is of value not only academically in terms of a college education, but also practically in terms of a career. In other words, knowing more about something—being able to explain it—is best for one’s practice.

For example, explaining why (not just how) there is a structural conflict of interest in the rating agencies getting paid by the issuers—indeed, knowing what a structural conflict of interest is—facilitates recognizing such conflicts when they occur. Such recognition is essential for a business practitioner who wants to make sense of an uncomfortable situation that is in actuality an institutional or individual conflict of interest. Moreover, if more of a given electorate understands and can recognize conflicts of interest, it is more likely that the elected representatives will enact legislation that obviates such conflicts. A viable republic is predicated on an educated and virtuous electorate.

That the Dodd-Frank Act of financial reform retains the issuer-pays system—relying almost exclusively on the rating agencies’ internal controls to somehow counter the temptations from the system itself—may mean that neither the general public nor the members of Congress sufficiently understood the nature of a structural conflict of interest when the law was written and enacted. Had more constituents understood the structural depth of the conflict in the issuer-pays arrangement itself, perhaps Sen. Dodd and Rep. Frank would have been able to resist the pressure from Wall Street and the rating agencies in order to root out the underlying cause of the conflict of interest.

Obviating the conflict of interest involving the rating agencies may not be as easy as it seems; even an investor-pays arrangement could involve a conflict of interest in that rating agencies would have an interest in rating an issue such that enough investors purchase it that the agency can realize a profit. Analyzing various alternatives for indications of institutional conflicts of interest would be a good means for students of business ethics to gain a better understanding of the nature of an institutional conflict of interest.

In short, short-circuiting knowledge by reducing education to vocation is paradoxically not in the interest of business; a business school in a university is not a corporation’s training center. Therefore, business professors and instructors should not conflate what they are with what they are studying and teaching. Even in terms of consulting, coming from a vantage-point that does not duplicate a corporate perspective (yet can relate knowledge to it) can be particularly valuable to managers. Ethically speaking, the most expedient route is not always the best of all possible worlds.

Higher Education: A Public Good or Commodity?

According to the New York Times, the American states, much more so than their European counterparts, began a gradual trend of what the paper calls “withdrawal” from higher education during the recession in the early 1990s. It is perhaps more accurate to say “decreased financial support” as the governments have not cut off their own universities. Even so, the change has been significant. The Times reports that the governments’ appropriations for their colleges and universities decreased by 7.6% in 2011-2012. This represents the largest annual decline in at least the previous fifty years—since the time of JFK—according to a report issued by Illinois State University. As of 2012, Arizona had decreased its higher education budget by 31 percent since the recession began in 2007. The housing market decline hit Arizona hard. I would not be surprised to find similar numbers in California, Florida and Illinois. California’s decreased spending on education in general was well publicized in 2011 by Jerry Brown, the head of state and chief executive.

As a result, even as universities have found ways to hold costs per student relatively steady—hovering around an inflation-adjusted $10,000 in the 1985-2012 period—the share of instruction costs paid for by tuition (aside from financial aid!) nearly doubled to 40 percent from 23 percent. The head of the University of California at Santa Cruz admitted that the reality is that students are paying more and getting less.


The trend evinces a shift from the belief that societies (or polities) benefit from higher education to a belief that the people receiving the education benefit primarily and thus should foot the bill, according to a trustee of the State University of New York. In their letters, John Adams and Thomas Jefferson agreed that an educated and virtuous citizenry is vital to a republic. By educated, they did not just mean in civics. Indeed, the ensuing tradition in American higher education has been that a broad foundation in the liberal arts and sciences should be in addition to, rather than in lieu of, an undergraduate degree in a professional school such as law, medicine, education, or business. As a result, an American physician or lawyer has two undergraduate degrees, whereas his or her European counterpart has only an undergraduate degree in medicine or law. The relative emphasis on liberal arts and sciences in America reflects the view of Adams and Jefferson that societies not only benefit from a broadly educated citizenry, but if the form of government is that of representative democracy (i.e., a republic), then such education is necessary. The shift in belief toward higher education as a product to be purchased like a car or soap goes against the tenet of the Founding Fathers and thus does not bode well for the continued viability of the American republics in the twenty-first century.

I would add an economic argument that treating healthcare or years of full-time higher education as though they were products to be purchased on the market like an iPod or smartphone misapplies the market mechanism because of the huge gap between the prices of healthcare and education and the wherewithal of the typical consumer to cover them financially, even in the form of debt. Just because communism fell to capitalism does not mean that absolutely everything is justly or even feasibly a commodity. Public goods, such as defense for instance, are just one case of an ill-fit with the market mechanism, even though it can serve a subsidiary role as a government purchases goods on the market to be used by the military. Similarly, the market mechanism can be used only in a limited way in regard to public colleges and universities.

While the market mechanism can be useful as colleges and universities compete with each other for faculty as well as students, this does not require that the students themselves foot the entire bill. There are non-price competitive elements, such as the quality of a faculty. Moreover, if Adams and Jefferson were correct, requiring students to pay so much tuition subsidizes people who do not go to college but reap the more general benefits of living in a free society (which benefits from its educated inhabitants).

The Times reports that economists “have found that higher education benefits communities even more than it benefits the individual receiving the degree. Studies show that an educated populace leads to faster economic growth and a more stable democracy, and benefits the poorest workers the most. The post World War II economic boom, for example, has been attributed to increased college enrollment thanks to the G.I. Bill.” To the extent that society as a whole benefits from there being educated citizens in it, then those citizens who are students at public universities should not have to foot all of the bill. It simply is not fair, or just (see Rawls’ theory of justice), even if such students are effectively subsidizing society’s poor.

A government has a rational interest in making a higher education affordable for its citizens who qualify educationally. The extent to which society as a whole benefits from the educated within it justifies the public funding. This is not to say that every citizen has an entitlement to go to an Ivy League university. However, just as in health insurance it is possible to guarantee every citizen a basic coverage while the wealthier are free to buy “Cadillac plans” for additional coverage, governments in America could offer any educationally qualified citizen a significant subsidy in attending one of the “in-state” public universities because of the benefit to society. Of course, in a federal system the matter is complicated by the question of which scale of society can be expected to benefit, though in the American case the spending clause of the U.S. Constitution, while being for the general welfare, may be limited to the enumerated powers listed for Congress. In any case, I think Europe is barking up the wrong tree in following America in relying more on tuition at state universities.

Source:
Catherine Rampell, “Where the Jobs Are, the Training May Not Be,” The New York Times, March 2, 2012.

Monday, February 3, 2014

An Identity Crisis Grips American Higher Education

In 2012, the average annual cost of attending a four-year public university was $15,100 and a private university was $32,900. The cost had risen 1,120% since 1978—four times the increase in the consumer price index. Student debt had also increased—to a trillion dollars (more than auto or credit-card debt). One in five households had student debt, a third of which was in default. In a poll by the Carnegie Corp. and Time magazine, 96% of senior higher-education administrators said the sector was in crisis, with 40% saying the crisis was severe (as if there were any other kind). Crises, at least in America, have a tendency to go on, even becoming the hackneyed status quo.
Gene Budig, whom I remember as the chancellor at my first university while I was enrolled there as a young lad, points in an editorial to “skill mismatches” and cost-containment problems as the underlying culprits in the crisis. I contend that in doing so he conflates education with training on the mismatches and oversimplifies the cost problem, which could just as well be said of business. Reading his opinion piece reinforced, and perhaps explained, the sense I had had as a student that Budig’s last university was substandard, even marginal, in an academic sense.

Gene A. Budig, former Chancellor of the University of Kansas (a third-tier school with an understandable inferiority complex) (Image Source: College Advocacy Board)
Budig claims that colleges and universities “need to synchronize their work with businesses.” For example, Microsoft estimated that between 2010 and 2020, the American economy would add more than 120,000 computing jobs annually. In 2012, American colleges and universities produced only 40,000 bachelor’s degrees in computer science. However, it is difficult to construe how full employment or industrial policy is the obligation of colleges and universities rather than governments. Furthermore, the proliferation of professional schools at universities suggests that any mismatch based on skill is hardly behind the crisis in American higher education. In fact, the movement of American professional schools away from education in order to emphasize skills may be a contributing problem. American higher education may have lost its moorings and thus wandered off to functions at odds with itself. To put it simply, education is not vocation. Theory and reasoning do not whittle down to critical thinking, as in problem-solving, and praxis.
Not surprisingly, the business school at Budig’s last university was all about skills. Although the skills taught in the auditing class bore little resemblance to the actual tasks in an audit (implying that more of a match would solve the problem), it could also be argued that the business school was in a virtual no-man’s land between vocation and education. In terms of skills, the technical orientation missed the rather obvious matter of how the conflict of interest between CPA firms and their clients is part of the audit itself. In short, the school failed at both rather than being merely insufficient in meeting the firms’ needs.

Notably, the dean at my next business school told the new MBA students on our very first day that we would not use what we learned at his school in business for at least ten or fifteen years. The content of the courses was not geared to the immediate needs of business. Rather, an understanding of business itself, its disciplines, and its relation to its environment comes into play, he explained, only at senior levels of management, where conceptual understanding may even be more important than skill. That dean’s orientation is in line with a business school being part of a university rather than a vocational training institute. Budig, in contrast, views colleges and universities through the lens of job training. Accordingly, he misconstrues higher education as something it is not. Losing an understanding of what education is permits it to be used in ways that are at odds with what it is. In other words, if universities gravitate increasingly toward providing skills useful to employers, then the provision of knowledge will suffer because other institutions in society are not going to pick up the slack on what is academia's native turf. A double-shot of training (at school and at work) leaves a void as far as higher education is concerned.
As an alternative to trying to be a corporation, law firm, or hospital, a university could focus on what only it can do from its own vantage point. Namely, while other institutions must be oriented to daily tasks, the classroom can afford students a unique perspective—that of understanding via theory and reasoning. It is no accident that colleges have had scholars on faculty. A person working in a corporation is unlikely to get much insight into why a human organization functions at all. A person is also unlikely to get the opportunity to theorize on the history of American constitutional philosophy while preparing for a case at a law firm. Pondering the nature of disease is thankfully sidelined at a hospital as physicians try to save lives. Understood rightly, a college education is a luxury.
Indeed, we have lost touch with the historical role of a college education as a respite (i.e., a break) from the office during which one can satisfy one’s intellectual curiosities in becoming educated. Budig’s claim that universities should do more to satisfy business’s demand for skills flies in the face of the basis of higher education. It is no wonder that higher education is foundering under its own weight.
The heavy weight manifests in one major way through cost increases far beyond inflation. Although Budig urges cost-containment, he treats it almost as if it were simply a business problem. He ignores the sector’s unique aspects. For example, the rather unique ideological culture at colleges and universities enables the proliferation of academic programs, majors, and staff offices geared to particular causes.
Student housing has become uncompetitive in price due in part to the proliferation of programs, and thus staff, that are only tangentially related to providing housing. Insufficiently moored in established disciplines, many schools taking students as customers have created additional majors to satisfy consumer demand. Here again, the concept of student is being misconstrued using a paradigm from another sector—that of business.
Even the concept of scholar is increasingly being misconstrued. At “universities” like Walden and the University of Phoenix, faculty are regarded by administrators as employees whose content can be overseen by non-experts. This essentially de-professionalizes professors. It should be no surprise that the “product” being taught is inferior at such profit-centers. Even at traditional colleges, the distinction between lecturing and research has been blurred or misunderstood.
To reduce senior professors’ teaching loads while hiring adjuncts (i.e., part-time instructors, mostly non-academics) to make up the difference can be challenged on two fronts. First, far from being a hindrance, giving lectures (literally, “professing”) can be a welcome break for scholars otherwise engrossed in the labors of writing or lab work. To be sure, the grading of large classes should be done by graduate students, given the value of a full professor’s time; reducing a senior scholar's teaching load is quite another thing. Expanding the lecturing obligation (with assistants handling the administration and grading) to go along with two seminars a year in the scholar's sub-discipline would effectively capitalize on the mature knowledge on campus without overtaxing it. Promising senior scholars one class per term is at the very least counter-productive from an academic standpoint, as the distribution of seasoned knowledge is minimized. Even if the perk seems necessary to attract "the best and the brightest" according to some human-resource calculus, perhaps applicants wanting to teach only one class per term are not the best and the brightest after all.

Additionally, universities could hire more full-time lecturers (i.e., teaching only) rather than add higher-salaried professors to fill teaching loads (even assuming no “teaching credit” is given for research). The lecturer-professor distinction is unique to academia, so it should be no surprise that the identity crisis plaguing higher education has taken a toll on managing the costs of teaching, specifically on maintaining an optimal balance of lecturers and professors.
Cost-containment in higher education has also been undone by the artificial (and unique) “market” of guaranteed (and subsidized) student loans. It has been all too easy for financial aid offices to max out students’ allowable debt as tuition and fees have, not coincidentally, risen. In effect, the increasing budgets of colleges and universities have been enabled by a known pool of debt. It is of no concern that not even bankruptcy can relieve an unemployed alumnus of the crushing psychological and financial burden that comes with de facto default. Lest it be argued that less of a mismatch of skill would solve this problem, one might consider the less-than-full-employment equilibrium that has become the default in industrialized economies. Even to link student loans to eventual employment implies the category mistake wherein education is rendered as essentially job training.
Having been the president or chancellor of three universities, Gene Budig can be taken as representative of the perspective of experienced governance in higher education in the United States. From my standpoint, that perspective has succumbed to a dominant value in American society—a value expressed in the works of William James and the orientation of Ben Franklin toward praxis. Americans are known to laud “movers and shakers” rather than philosophers and artists. Europeans are perhaps more balanced in this regard. Rather than contort universities into full-employment engines, as if knowledge has value only to the extent it is useful in someone’s job, vocational training institutes could be created, leaving education to the colleges and universities, where even the professional schools would be free to be knowledge rather than skill-based. This is already the case at the professional schools at the top of the Ivy League. Sadly, the message has not percolated through the system. In fact, most universities actually regard the undergraduate degrees in the professional schools as doctorates!
In conclusion, Gene Budig evinces what is wrong with higher education rather than how it can be fixed. Fundamentally, being in line with one’s nature permits one to excel, whereas being at odds with what one is can spell disaster. This lesson could hardly come from skill and practice. It is ironic that the schools most oriented to skills may still be quite distant from what practitioners need. As much as the schools’ faculties want to pretend to inhabit corporations, hospitals, and law firms, respectively, rest assured that the schools are on the university side of the divide, so considerable distance from “meeting the needs” of “the real world” is inevitable. Yet the venture effectively stifles the schools’ participation in the “knitting” of academic life. The result is “crisis” in the sense of an identity crisis. In effect, higher education has lost touch with itself as it has unwittingly swallowed one of the dominant values in American society. Nietzsche writes that a philosopher cannot be a person of his or her own day. Might this also be true of academia—and not as a vice or drawback, but, rather, as a diamond inadvertently covered over in the rough?

Source:

Gene A. Budig, “Beating America’s College Crisis,” USA Today, January 7, 2013.

Decadence at Yale: Justice Thomas

For the first time since 2006, Justice Clarence Thomas of the U.S. Supreme Court spoke during oral argument on January 14, 2013. Even though a quip by Justice Scalia prompted the stealth justice to reply, the content is nonetheless revealing concerning Justice Thomas's quite understandable attitude toward Yale Law School, one of his alma maters.

The case before the Court concerned a defendant from Louisiana seeking to have his murder conviction overturned; it involved the Sixth Amendment right to a speedy trial. Backing Louisiana’s position that the defendant had received an adequate defense, Justice Scalia sought to extol the qualifications of the defendant’s lawyers; counsel for Louisiana had cited Yale Law School as "evidence" of the high competence of at least one of the defendant's lawyers. At that point, several people present in the room heard Justice Thomas remark, “Well, he did not have competent counsel, then.” Thomas was not entirely joking.
                                                                       Image source: time.com
Trying to account for Thomas's rare breach of his long-kept silence on the bench, the Wall Street Journal points to Yale’s emphasis on affirmative action as having embarrassed Clarence Thomas as a law student benefiting from the program in the 1970s. In his memoir, My Grandfather’s Son, Thomas writes of his fear back then that other students might assume he had been admitted because he is black rather than because of any of his personal accomplishments. “As a symbol of my disillusionment," he wrote in his memoir, "I peeled a 15-cent sticker off a package of cigars and stuck it on the frame of my law degree to remind myself of the mistake I’d made by going to Yale. . . . I never did change my mind about its value.” Add in the fact that he would store his Yale diploma in his basement instead of displaying it in his chambers, and his castigation of his diploma's value suddenly looks like more than just opposition to affirmative action.

Although not a law student at Yale, I did take and sit in on some law classes on account of the amount of historical political and economic thought in them. Being a Midwesterner, I was not used to the intensity of passive aggression (not to mention raw anger), especially in the law school, when it came to enforcing "political correctness." Had Thomas so much as sneezed contempt for affirmative action, the "sharks" would have been on him in an instant. I once made a comment in class in support of Scalia and Thomas on federalism, only to find that a feminist law student had "retaliated" against me by telling an academic administrator that I had tripped her. The administrator, who had a law degree and thus had surely taken a course in evidence, apparently did not need any. As a white male, I must have tripped the black female. Thank God I was not a law student at Yale! Even so, my diploma is in storage.


Source:

Jess Bravin, “Seven-Year Itch: Thomas Breaks Silence,” The Wall Street Journal, January 14, 2013.

Pragmatism Prevails at American Colleges

From 2010 through 2012, freshman enrollment at more than 25 percent of American four-year private colleges declined 10 percent or more; from 2006 through 2009, fewer than one in five such schools had suffered a similar decline.[1] Georgian Court “University” in New Jersey saw its entering class shrink by a third in 2012. Rockford College, whose administration had foolishly spent the college’s entire endowment to buy a college in Europe only to sell it without realizing much if any financial gain, “re-invented” itself as a university in 2013. The name-only change made it possible for more foreign students aided by their respective governments to attend. To be sure, hubris was also palpable in the motivation, particularly as the college was still a college on the ground, such as in the insistence that locals use the word UNIVERSITY. In short, the colleges with distant orbits from the academic sun identified themselves by their own desperate measures. The proof, as they say, is in the pudding.
More than one factor likely contributed to the declining trend pertaining to the small 4-year colleges. In this essay, I bring out a rather subtle contributor.
First, the usual suspects. College costs, and thus tuition, were increasing at triple the rate of inflation. Academics, at least those without business experience or an M.B.A., may not be equipped to manage a college efficiently. For example, how many colleges hire lecturers to teach the basic courses, reducing the professorial payroll? Additionally, how many colleges encourage faculty to videotape lectures for students to watch on-line so class sessions can concentrate on problem-solving (e.g., mathematics) and answering questions? Each faculty member would be able to teach more courses per term, hence lowering the faculty payroll.
Another factor typically cited in the media is the onslaught of lower-cost on-line courses, especially at the “on-line universities” such as the University of Phoenix. The number of Americans taking at least one course on-line increased 45 percent between 2008 and 2013.[2] Although administrators at some traditional “brick and mortar” colleges were adding on-line course options, the increase generally put mounting pressure on courses delivered traditionally rather than on-line. Why pay so much more if the learning outcome is the same? Or is it? Do we know, particularly if a moving target is involved?
Lest it be thought that changing demographics—fewer people entering college following the baby-boomers’ children—account for the decline, the on-line-oriented for-profit “universities” saw greatly expanding enrollments. This is not to say that this factor is a dead-end. The very notion of a for-profit university oriented to delivering content in ways most convenient to students evinces a conflation of vocationalism and education—of skill and knowledge, respectively. Whereas the former answers how-to questions, the latter explains and thus is oriented mostly to why questions. On-line “universities” were able to leverage the credibility of educational institutions while using technology particularly fit for skills.
Moreover, the value of a college degree was increasingly based on vocational criteria. According to the Wall Street Journal, “questions about a college degree’s value” were “challenging centuries-old business models.”[3] In other words, the lack of any good job prospects for graduating seniors was assumed to mean that the college degree had lost value. That a college might exist to answer intellectual curiosity and, moreover, enable people to be educated had been largely forgotten. The value of the practical in American society had finally interlarded itself in “higher” education. What would Jefferson and Adams, who agreed that a virtuous and educated citizenry is vital to a viable republic, think?




1. Douglas Belkin, “Private Colleges Squeezed,” The Wall Street Journal, November 9-10, 2013.
2. Ibid.
3. Ibid.