Saturday, August 16, 2014

Business Culture Forming Higher Education

“Publish or perish” is the infamous mantra of the intrepid scholars who work at research universities and at many of the prestigious liberal arts and sciences colleges dotting the map of the world. The need to demonstrate regular output is perhaps nowhere more stressed (hence, stress) than in the United States. The declining number of tenure positions (amid increasing reliance on adjuncts, not coincidentally) at American colleges and universities would be challenge enough for newly minted doctors aspiring to the intellectual freedom that comes with the protection of tenure. Yet the young scholars are also increasingly subjected to an "assembly-line" process wherein faculty administrators treat their junior colleagues' published journal articles like chocolates on a conveyor belt. The practice puts scholarship at odds with itself and is thus utterly self-defeating from the standpoint of society gaining new knowledge.

Notre Dame’s business school, for example, requires two academic articles per year of its scholars who come up for tenure. That particular regimen teems with the oily scent of production management awkwardly applied to a sort of ethereal assembly line that mechanistically combines ideas at regular intervals. Like subprime mortgages bundled into financial derivative securities (i.e., bonds), the compound ideas are themselves bundled into at least two major “salable products” per year, like clockwork. Such a rationalized, linear process hardly befits the mind of a scientist, social “scientist,” or philosopher. The process of ideational development (i.e., theory construction) can require sudden bursts of inspiration and imagination, stimulated by some good old-fashioned mind-wandering aided by the gentle push of reasoning.[1]

 Can scholarship be "produced" as though the ideas were on an assembly line?    
Image Source: www.prophecy1.com

Cramming a scholar’s methodology through a “meat-grinder” business model is like forcing a square peg into a round hole. Superimposing “cookie-cutter” production management tools onto scholarship can be expected to result in mediocre work that only incrementally adds to an existing paradigm. Put another way, the approach is not apt to result in another Kant or Einstein, and that's a pity considering how much computer technology could facilitate the scholarship of a great mind.

Enabling the category mistake, not just a few business schools pretend to be businesses, trimming away the fecundity of true scholarship like scrap metal in a factory. Primped-up grizzled suits (i.e., professors who play the part of “corporate executive” aided by a corporate script) pass like whispers through muffled corridors of fresh carpet to train future expert-consultants and managers, with easily memorizable PowerPoint presentations standing in for lectures. Returning to well-ordered offices with comfortable chairs and only a few carefully arranged stacks of papers on the desktops, the “scholars” write what they hope will be pleasing to the practitioners who read the “professional” journals and hire consultants.

With the deepening and widening of human knowledge presumed to be readily regularizable into distinct units of production, the work of a university employee ostensibly tasked with scholarship succumbs to the hegemony of business values. Those values include the sheer constancy of monetized worth, as effected, for instance, by the regular output of ideas or automobiles. Moreover, monetization is held aloft as the criterion, fully capable of turning everything (and everyone) into a commodity.

I suspect that people around the world, particularly in Europe, would graciously point to the hypertrophic importance that Americans attach to barren metal (i.e., money) and business mores or practices (e.g., being “professional”). In fact, this over-reach of business values in American culture may be spreading to Europe in the guise of avant-garde trends.

Months before the arrival of the turbulent twentieth century, John Watson (1850-1907), a European theologian and clergyman publishing under the pseudonym of Ian Maclaren, put his prescient observations of American culture to paper:  


“The friendly visitor to the United States, who is proud of her achievements and delighted by her brightness, stands aghast at the open and unabashed front of secularity. It seems to him as if not merely coarse and unlettered men, whose souls have never been touched, either by religion or by culture, but that all men, with a few delightful exceptions, bow the knee to this golden calf, and do it homage. Nowhere is there such constant and straightforward talk about money, nowhere is such importance attached to the amount of money which a man has acquired or possesses, nowhere is it taken so absolutely for granted that the object of a man’s work is to obtain money, and that if you offer him money enough he will be willing to do any work which is not illegal; that, in short, the motive power with almost every man is his wages. One is struck, not so much by what is said in plain words (although a dollar is a monotonous refrain in conversation), as by what is implied, and what is implied is this: that if you know the proper sum, any man can be induced to do what you want, even though his health, and his rest, and his family, and his principles, stand in the way.”[2]

More than a century later, European society would be more secular than its American counterpart. Proportionately, many more Americans attend a religious service on a weekly basis. Additionally, the explosion of mega-churches in several of the U.S. states did not translate into a corresponding “growth” in E.U. states. Even so, Watson’s description of the American obsession with income and wealth as necessary and decisive in the valuing of others and oneself resonates with what I have observed. Judging from the reactions of people from abroad, I get the sense that many Americans—especially those working in the business world—do indeed go overboard rather uniquely in reducing personal value to that which can be measured by how much money someone has in a bank account. Just as Watson intended to help rather than beat up on American society, I too want to see more balance as befits the heterogeneity of values that is at home in human nature.


1. Hence the “absent-minded professor” label.
2. Ian Maclaren, “The Shadow on American Life: An Impression of a Recent Visit,” The Outlook, 63 (September 9, 1899), pp. 116-18.

Tuesday, February 11, 2014

Teaching Business Ethics Beyond Skill and Tactic

At the end of March 2010, Warren Buffett spoke to business students at Columbia University. One asked the billionaire whether being ethical in business comes from how a person is raised or from a business ethics course. Buffett quickly answered that it is learned in the home. I concur.

A business ethics course properly understood is not prescriptive; rather, knowledge of ethical principles is learned and applied to topics in business as well as to business systems (domestic and comparative). Although understanding itself can change behavior, academic study is not geared to making a student more ethical. The focus of an instructor or professor of business ethics, in other words, ought to be on imparting knowledge rather than skills that somehow turn people into ethical human beings.

Following bullet-points on how to make ethical decisions, for example, does not make one ethical, just as the manipulation of symbols according to rules does not constitute understanding. Rather, for a person who is inclined to consider the ethical dimension at work or more generally, knowing more about ethics can come into play.

It follows that business ethics is not a skill or a list of tactics; education is not vocation. Paradoxically, understanding ethical principles and explaining their relevance to business is of value not only academically in terms of a college education, but also practically in terms of a career. In other words, knowing more about something—being able to explain it—ultimately serves one’s practice as well.

For example, explaining why (not just how) there is a structural conflict of interest in the rating agencies being paid by the issuers—indeed, knowing what a structural conflict of interest is—facilitates recognizing such conflicts when they occur. Such recognition is essential for a business practitioner who wants to make sense of an uncomfortable situation that is in actuality an institutional or individual conflict of interest. Moreover, if more of a given electorate understands and can recognize conflicts of interest, it is more likely that elected representatives will enact legislation that obviates such conflicts. A viable republic is predicated on an educated and virtuous electorate.

That the Dodd-Frank Act of financial reform retains the issuer-pays system—relying almost exclusively on the rating agencies’ internal controls to somehow counter the temptations from the system itself—may mean that neither the general public nor the members of Congress sufficiently understood the nature of a structural conflict of interest when the law was written and enacted. Had more constituents understood the structural depth of the conflict in the issuer-pays arrangement itself, perhaps Sen. Dodd and Rep. Frank would have been able to resist the pressure from Wall Street and the rating agencies in order to root out the underlying cause of the conflict of interest.

Obviating the conflict of interest involving the rating agencies may not be as easy as it seems; even an investor-pays arrangement could involve a conflict of interest, in that the agencies would have an interest in rating an issue in whatever way attracts enough paying investors for the agency to realize a profit. Analyzing the various alternatives for indications of institutional conflicts of interest would be a good exercise for students of business ethics seeking a better understanding of the nature of such conflicts.

In short, short-circuiting knowledge by reducing education to vocation is paradoxically not in the interest of business; a business school in a university is not a corporation’s training center. Therefore, business professors and instructors should not conflate what they are with what they are studying and teaching. Even in terms of consulting, coming from a vantage-point that does not duplicate a corporate perspective (yet can relate knowledge to it) can be particularly valuable to managers. Ethically speaking, the most expedient route is not always the best of all possible worlds.

Higher Education: A Public Good or Commodity?

According to the New York Times, the American states, much more so than their European counterparts, began a gradual trend of what the paper calls “withdrawal” from higher education during the recession in the early 1990s. It is perhaps more accurate to say “decreased financial support” as the governments have not cut off their own universities. Even so, the change has been significant. The Times reports that the governments’ appropriations for their colleges and universities decreased by 7.6% in 2011-2012. This represents the largest annual decline in at least the previous fifty years—since the time of JFK—according to a report issued by Illinois State University. As of 2012, Arizona had decreased its higher education budget by 31 percent since the recession began in 2007. The housing market decline hit Arizona hard. I would not be surprised to find similar numbers in California, Florida and Illinois. California’s decreased spending on education in general was well publicized in 2011 by Jerry Brown, the head of state and chief executive.

As a result, even as universities have found ways to hold costs per student relatively steady—hovering around an inflation-adjusted $10,000 in the 1985-2012 period—the share of instruction costs paid for by tuition (aside from financial aid!) nearly doubled to 40 percent from 23 percent. The head of the University of California at Santa Cruz admitted that the reality is that students are paying more and getting less.
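
To put rough numbers on that shift, here is a back-of-the-envelope sketch; it assumes the roughly $10,000 inflation-adjusted cost per student cited above applies at both endpoints, which the reported figures only approximate:

\[
0.23 \times \$10{,}000 \approx \$2{,}300
\qquad\text{versus}\qquad
0.40 \times \$10{,}000 = \$4{,}000 ,
\]

an increase of roughly three-quarters in the real tuition burden per student even though the underlying cost of instruction held essentially steady.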


The trend evinces a shift from the belief that societies (or polities) benefit from higher education to a belief that the people receiving the education benefit primarily and thus should foot the bill, according to a trustee of the State University of New York. In their letters, John Adams and Thomas Jefferson agreed that an educated and virtuous citizenry is vital to a republic. By educated, they did not just mean in civics. Indeed, the ensuing tradition in American higher education has been that a broad foundation in the liberal arts and sciences should be in addition to, rather than merely in lieu of, an undergraduate degree in a professional school such as law, medicine, education, or business. As a result, an American physician or lawyer has two undergraduate degrees, whereas his or her European counterpart has only an undergraduate degree in medicine or law. The relative emphasis on liberal arts and sciences in America reflects the view of Adams and Jefferson that societies not only benefit from a broadly educated citizenry, but if the form of government is that of representative democracy (i.e., a republic), then such education is necessary. The shift in belief toward higher education as a product to be purchased like a car or soap goes against the tenet of the Founding Fathers and thus does not bode well for the continued viability of the American republics in the twenty-first century.

I would add an economic argument: treating healthcare or years of full-time higher education as though they were products to be purchased on the market like an iPod or a smartphone misapplies the market mechanism because of the huge gap between the prices of healthcare and education and the wherewithal of the typical consumer to cover them financially, even in the form of debt. Just because communism fell to capitalism does not mean that absolutely everything is justly or even feasibly a commodity. Public goods, such as defense, are just one case of an ill fit with the market mechanism, even though it can serve a subsidiary role when a government purchases goods on the market for the military. Similarly, the market mechanism can be used only in a limited way in regard to public colleges and universities.

While the market mechanism can be useful as colleges and universities compete with each other for faculty as well as students, this does not require that the students themselves foot the entire bill. There are non-price competitive elements, such as the quality of a faculty. Moreover, if Adams and Jefferson were correct, requiring students to pay so much tuition means that they effectively subsidize people who do not go to college yet still reap the more general benefits of living in a free society (which benefits from its educated inhabitants).

The Times reports that economists “have found that higher education benefits communities even more than it benefits the individual receiving the degree. Studies show that an educated populace leads to faster economic growth and a more stable democracy, and benefits the poorest workers the most. The post World War II economic boom, for example, has been attributed to increased college enrollment thanks to the G.I. Bill.” To the extent that society as a whole benefits from there being educated citizens in it, then those citizens who are students at public universities should not have to foot all of the bill. It simply is not fair, or just (see Rawls’ theory of justice), even if such students are effectively subsidizing society’s poor.

A government has a rational interest in making a higher education affordable for its citizens who qualify educationally. The extent to which society as a whole benefits from the educated within it justifies the public funding. This is not to say that every citizen has an entitlement to go to an Ivy League university. However, just as in health-insurance it is possible to guarantee every citizen a basic coverage while the wealthier are free to buy “Cadillac plans” for additional coverage, governments in America could offer any educationally qualified citizen a significant subsidy in attending one of the “in-state” public universities because of the benefit to society. Of course, in a federal system the matter is complicated by the question of which scale of society can be expected to benefit, though in the American case the spending clause of the U.S. Constitution, while being for the general welfare, may be limited to the enumerated powers listed for Congress. In any case, I think Europe is barking up the wrong tree in following America in relying more on tuition at state universities.

Source:
Catherine Rampell, “Where the Jobs Are, the Training May Not Be,” The New York Times, March 2, 2012.

Monday, February 3, 2014

An Identity Crisis Grips American Higher Education

In 2012, the average annual cost of attending a four-year public university was $15,100 and a private university was $32,900. The cost had risen 1,120% since 1978—four times the increase in the consumer price index. Student debt had also increased—to a trillion dollars (more than auto or credit-card debt). One in five households had student debt—a third of which was in default. In a poll by the Carnegie Corp. and Time magazine, 96% of senior higher education administrators said the sector is in crisis—40% saying the crisis was severe (as if there were any other kind). Crises, at least in America, have a tendency to go on, even becoming the hackneyed status quo.
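For a sense of scale, a back-of-the-envelope reconstruction: if, purely for illustration, the 1,120% increase is applied to the public four-year figure (the reporting does not specify exactly which cost series it describes), the implied 1978 cost would have been on the order of

\[
\frac{\$15{,}100}{1 + 11.20} \approx \$1{,}240 \text{ per year (nominal)},
\]

whereas a consumer-price increase of one-quarter that size (about 280%) would have raised the same $1,240 to only around $4,700.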
Gene Budig, whom I remember as the chancellor at my first university while I was enrolled as a young lad, points in an editorial to “skill mismatches” and cost-containment problems as the underlying culprits in the crisis. I contend that in doing so he conflates education with training on the matter of mismatches, and that he oversimplifies the cost problem by treating it as one that could belong just as well to a business. Reading his opinion piece reinforced, or explained, the sense I had had as a student that Budig’s last university was substandard, even marginal, in an academic sense.

 Gene A. Budig, former Chancellor of the University of Kansas (a third-tier school with an understandable inferiority complex)  (Image Source: College Advocacy Board)
Budig claims that colleges and universities “need to synchronize their work with businesses.” For example, Microsoft estimated that between 2010 and 2020, the American economy would add more than 120,000 computing jobs annually. In 2012, American colleges and universities produced only 40,000 bachelor’s degrees in computer science. However, it is difficult to see how full employment or industrial policy is the obligation of colleges and universities rather than governments. Furthermore, the proliferation of professional schools at universities suggests that any mismatch based on skill is hardly behind the crisis in American higher education. In fact, the movement of American professional schools away from education in order to emphasize skills may be a contributing problem. American higher education may have lost its moorings and thus wandered off to functions at odds with itself. To put it simply, education is not vocation. Theory and reasoning do not whittle down to critical thinking, as in problem-solving, and praxis.
Not surprisingly, the business school at Budig’s last university was all about skills. Although the skills taught in the auditing class bore little resemblance to the actual tasks in an audit (which might seem to imply that a closer match would solve the problem), it could also be argued that the business school was in a virtual no-man’s land between vocation and education. In terms of skills, the technical orientation missed the rather obvious matter of how the conflict of interest between CPA firms and their clients is part of the audit itself. In short, the school failed at both rather than being merely insufficient in meeting the firms’ needs.

Notably, the dean at my next business school told the new MBA students on our very first day that we would not use what we learn at his school in business for at least ten or fifteen years. The content of the courses was not geared to the immediate needs of business. Rather, an understanding of business itself, its disciplines, and its relation to its environment comes into play, he explained, only at senior levels of management, where conceptual understanding  may even be more important than skill. That dean’s orientation is in line with a business school being part of a university rather than a vocational training institute. Budig, in contrast, views colleges and universities through the lenses of job training. Accordingly, he misconstrues higher education itself into something it is not. Losing an understanding of what education is permits it to be used in ways that are at odds with what it is. In other words, if universities gravitate increasingly toward providing skills useful to employers, then the provision of knowledge will suffer because other institutions in society are not going to pick up the slack on what is academia's native turf. A double-shot of training (at school and at work) leaves a void as far as higher education is concerned.
As an alternative to trying to be a corporation, law firm or hospital, a university could focus on what only it can do from its own vantage-point. Namely, while other institutions must be oriented to daily tasks, the classroom can afford students a unique perspective—that of understanding via theory and reasoning. It is no accident that colleges have had scholars on faculty. A person working in a corporation is unlikely to get much on why it is that a human organization functions at all. A person is also unlikely to get the opportunity to theorize on the history of American constitutional philosophy while preparing for a case at a law firm. Pondering the nature of disease is thankfully sidelined at a hospital as physicians try to save lives. Understood rightly, a college education is a luxury.
Indeed, we have lost touch with the historical role of a college education as a respite (i.e., a break) from the office during which one can satisfy one’s intellectual curiosities in becoming educated. Budig’s claim that universities should do more to satisfy businesses’ demand for skills flies in the face of this basis of higher education. It is no wonder that higher education is foundering under its own weight.
The heavy weight manifests in one major way through cost increases far beyond inflation. Although Budig urges cost-containment, he treats it almost as if it were simply a business problem. He ignores the sector’s unique aspects. For example, the rather unique ideological culture at colleges and universities enables the proliferation of academic programs, majors, and staff offices geared to particular causes.
Student housing has become uncompetitive in price due in part to the proliferation of programs, and thus staff, that are only tangentially related to providing housing. Insufficiently moored in established disciplines, many schools treating students as customers have created additional majors to satisfy consumer demand. Here again, the concept of the student is being misconstrued using a paradigm from another sector—that of business.
Even the concept of the scholar is increasingly being misconstrued. At “universities” like Walden and the University of Phoenix, faculty are regarded by administrators as employees whose course content can be overseen by non-experts. This essentially de-professionalizes professors. It should be no surprise that the “product” being taught at such profit-centers is inferior. Even at traditional colleges, the distinction between lecturing and research has been blurred or misunderstood.
To reduce senior professors’ teaching loads while hiring adjuncts (i.e., part-timers, mostly non-academics) to make up the difference in teaching can be challenged on two fronts. First, far from being a hindrance, giving lectures (literally, “professing”) can be a welcome break for scholars otherwise engrossed in the labors of writing or lab work. To be sure, the grading of large classes should be done by graduate students, given the value of a full professor’s time. To reduce a senior scholar's teaching load is quite another thing. Expanding the lecturing obligation (with assistants handling the administration and grading) to go along with two seminars a year in the scholar's sub-discipline would effectively capitalize on the mature knowledge on campus without overtaxing it. Promising senior scholars one class per term is at the very least counter-productive from an academic standpoint, as the distribution of seasoned knowledge is minimized. Even if the perk would seem to be necessary to gain "the best and the brightest" according to some human-resource calculus, perhaps the applicants wanting to teach only one class per term are not the best and the brightest after all.

Additionally, universities could hire more full-time lecturers (i.e., teaching only) rather than add higher-salaried professors to fill teaching loads (even assuming no “teaching credit” is given for research). The lecturer-professor distinction is unique to academia, so it should be no surprise that the identity crisis plaguing higher education has taken a toll on managing the costs of teaching, specifically on maintaining an optimal balance of lecturers and professors.
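To see the payroll logic, consider the cost per course section; the salary and course-load figures below are hypothetical placeholders of my own, not numbers from Budig's editorial or from any source cited here:

\[
\text{cost per section} = \frac{\text{annual salary}}{\text{sections per year}}; \qquad
\frac{\$120{,}000}{4} = \$30{,}000
\quad\text{versus}\quad
\frac{\$55{,}000}{8} \approx \$6{,}900 .
\]

Under any such assumptions, covering introductory sections with full-time lecturers lowers the teaching payroll, which is precisely why the lecturer-professor balance matters to cost-containment.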
Cost-containment in higher education has also been undone by the artificial (and unique) “market” of guaranteed (and subsidized) student loans. It has been all too easy for financial aid offices to max out students’ allowable debt as tuition and fees have, not coincidentally, risen. In effect, the increasing budgets of colleges and universities have been enabled by a known pool of debt. It is apparently of no concern that not even bankruptcy can relieve an unemployed alumnus of the crushing psychological and financial burden that comes with de facto default. Lest it be argued that less of a skill mismatch would solve this problem, one might consider the less-than-full-employment equilibrium that has become the default in industrialized economies. Even to link student loans to eventual employment implies the category mistake wherein education is rendered as essentially job training.
Having been the president or chancellor of three universities, Gene Budig can be taken as representative of the perspective of experienced governance in higher education in the United States. From my standpoint, that perspective has succumbed to a dominant value in American society—a value expressed in the works of William James and in Ben Franklin’s orientation toward praxis. Americans are known to laud “movers and shakers” rather than philosophers and artists. Europeans are perhaps more balanced in this regard. Rather than contort universities into full-employment engines, as if knowledge had value only to the extent that it is useful in someone’s job, vocational training institutes could be created, leaving education to the colleges and universities, where even the professional schools would be free to be knowledge-based rather than skill-based. This is already the case at the professional schools at the top of the Ivy League. Sadly, the message has not percolated through the system. In fact, most universities actually regard the undergraduate degrees in the professional schools as doctorates!
In conclusion, Gene Budig evinces what is wrong with higher education rather than how it can be fixed. Fundamentally, to be in line with one’s nature permits one to excel, whereas to be something at odds with what one is can spell disaster. This lesson could hardly come from skill and practice. It is ironic that the schools most oriented to skills may still be quite distant from what practitioners need. As much as the schools’ faculties want to pretend to inhabit corporations, hospitals, and law firms, respectively, rest assured that the schools are on the university side of the divide, and thus a considerable distance from “meeting needs” of “the real world” is inevitable. Yet the venture effectively stifles the schools’ participation in the “knitting” of academic life. The result is “crisis” in the sense of an identity crisis. In effect, higher education has lost touch with itself as it has unwittingly swallowed one of the dominant values in American society. Nietzsche writes that a philosopher cannot be a person of his or her own day. Might this also be true of academia—and not as a vice or drawback, but, rather, as a diamond that has been inadvertently covered over in the rough?

Source:

Gene A. Budig, “Beating America’s College Crisis,” USA Today, January 7, 2013.

Decadence at Yale: Justice Thomas

For the first time since 2006, Justice Clarence Thomas of the U.S. Supreme Court spoke during oral argument on January 14, 2013. Even though a quip made by Justice Scalia prompted the stealth justice to reply, the content is nonetheless quite revealing of Thomas's understandable attitude toward Yale Law School, one of his alma maters.

The case before the Court concerned a defendant from Louisiana seeking to have his murder conviction overturned; it involved the Sixth Amendment right to a speedy trial. Backing Louisiana’s position that the defendant had received an adequate defense, Justice Scalia sought to extol the qualifications of the defendant’s lawyers; counsel for Louisiana had cited Yale Law School as "evidence" of the high competence of at least one of the defendant's lawyers. At that point, several people present in the room heard Justice Thomas remark, “Well, he did not have competent counsel, then.” Thomas was not entirely joking.
                                                                       Image source: time.com
Trying to account for Thomas's rare breach of his long-kept silence on the bench, the Wall Street Journal points to Yale’s emphasis on affirmative action as having embarrassed Clarence Thomas as a law student benefiting from the program in the 1970s. In his memoir, My Grandfather’s Son, Thomas writes of his fear back then that other students might assume he had been admitted because he is black rather than because of any of his personal accomplishments. “As a symbol of my disillusionment,” he wrote, “I peeled a 15-cent sticker off a package of cigars and stuck it on the frame of my law degree to remind myself of the mistake I’d made by going to Yale. . . . I never did change my mind about its value.” Add in the fact that he would store his Yale diploma in his basement instead of displaying it in his chambers, and his castigation of his diploma's value suddenly looks like more than just opposition to affirmative action.

Although not a law student at Yale, I did take and sit in on some law classes on account of the amount of historical political and economic thought in them. Being a Midwesterner, I was not used to the intensity of passive aggression (not to mention raw anger), especially in the law school when it came to enforcing "political correctness." Had Thomas even sneezed any contempt for affirmative action, the "sharks" would have been at him in an instant. I once made a comment in class in support of Scalia and Thomas on federalism, only to find that a feminist law student had "retaliated" against me by telling an academic administrator that I had tripped her. The administrator, who had a law degree and thus had surely taken a course in evidence, apparently did not need any. As a white male, I must have tripped the black female. Thank God I was not a law student at Yale! Even so, my diploma is in storage.


Source:

Jess Bravin, “Seven-Year Itch: Thomas Breaks Silence,” The Wall Street Journal, January 14, 2013.

Pragmatism Prevails at American Colleges

From 2010 through 2012, freshman enrollment at more than 25 percent of American 4-year private colleges declined 10 percent or more; from 2006 through 2009, fewer than one in five such schools had suffered a similar decline.[1] Georgian Court “University” in New Jersey saw its entering class shrink by a third in 2012. Rockford College, whose administration had foolishly spent the college’s entire endowment to buy a college in Europe only to sell it without realizing much if any financial gain, “re-invented” itself as a university in 2013. The name-only change made it possible for more foreign students aided by their respective governments to attend. To be sure, hubris was also palpable in the motivation, particularly as the college was still a college on the ground, as in the insistence that locals use the word UNIVERSITY. In short, the colleges in the most distant orbits from the academic sun identified themselves by their own desperate measures. The proof, as they say, is in the pudding.
More than one factor likely contributed to the declining trend pertaining to the small 4-year colleges. In this essay, I bring out a rather subtle contributor.
First, the usual suspects. College costs, and thus tuition, were increasing at triple the rate of inflation. Academics, at least those without business experience or an M.B.A., may not be equipped to manage a college efficiently. For example, how many colleges hire lecturers to teach the basic courses, reducing the payroll of professors? Additionally, how many colleges encourage faculty to video-tape lectures for the students to watch on-line so class sessions can concentrate on problem-solving (e.g., mathematics) and answering questions? Each faculty member would then be able to teach more courses per term, hence lowering the faculty payroll.
Another factor typically cited in the media is the onslaught of lower-cost on-line courses, especially at the “on-line universities” such as the University of Phoenix. The number of Americans taking at least one course on-line increased 45 percent between 2008 and 2013.[2] Although administrators at some traditional “brick and mortar” colleges were adding on-line course options, the 45 percent increase generally put increasing pressure on courses delivered traditionally rather than on-line. Why pay so much more if the learning outcome is the same? Or is it? Do we know, particularly if a moving target is involved?
Lest it be thought that changing demographics—fewer people entering college following the baby-boomers’ children—account for the decline, the on-line-oriented for-profit “universities” saw greatly expanding enrollment numbers. This is not to say that this factor is a dead-end. The very notion of a for-profit university oriented to delivering content in ways that are most convenient to students evinces a conflation of vocationalism and education—skill and knowledge, respectively. Whereas the former answers how-to questions, the latter explains and thus is oriented mostly to why questions. On-line “universities” were able to leverage the credibility of the educational institution while using technology particularly fit for skills.
Moreover, the value of a college degree was increasingly based on vocational criteria. According to the Wall Street Journal, “questions about a college degree’s value” were “challenging centuries-old business models.”[3] In other words, the lack of any good job prospects for graduating seniors was assumed to mean that the college degree had lost value. That a college might exist to answer intellectual curiosity and, moreover, enable people to be educated had been largely forgotten. The value of the practical in American society had finally interlarded itself in “higher” education. What would Jefferson and Adams, who agreed that a virtuous and educated citizenry is vital to a viable republic, think?




1. Douglas Belkin, “Private Colleges Squeezed,” The Wall Street Journal, November 9-10, 2013.
2. Ibid.
3. Ibid.

Thursday, April 5, 2012

Experts on the Supreme Court: Lawyers Who Teach

For all the American lawyers and law “professors” who had been predicting, on the basis of their "expertise," that the three days of oral arguments before the U.S. Supreme Court meant the Affordable Care Act would go down, the Court's decision must have been a rude awakening. As one report put it immediately after the ruling, the decision came "as something of a surprise after the generally hostile reception the law received during the six hours of oral arguments."[1] That is an understatement, to say the least.

Lest the public be taken in by the predictions of judicial "experts" in the future, we might want to recalibrate just how much insight the so-called experts really have on the inner workings of the U.S. Supreme Court.

This public face of the U.S. Supreme Court may be distinct from what goes on behind the curtain. (Image Source: NYT)

Pete Williams, a journalist, reported on NBC Nightly News after one of the days of oral arguments that “Obamacare” was in trouble. Unless Williams had clerked at the Court, it is unlikely that he knew the Court’s inner workings, not to mention how the justices use oral arguments. For example, a justice might use the arguments to test out legal theories. Because I have not clerked at the Court, have no knowledge of its inner workings, and cannot get into a justice’s head, my projection out from the arguments can only be conjecture. Sadly, the journalists covering the case showed no such hesitancy concerning their own knowledge of the court and, indeed, the oral arguments themselves.

Lest we turn to American law “professors” as experts having even more insight into the Court’s workings than do the lawyers who have argued before the justices, it is important to remember that American law schools hire lawyers rather than scholars to teach law. Whereas legal scholars hold the doctorate in law, the J.S.D. (Doctor of Juridical Science), to practice law one needs only the undergraduate, or first, degree in law (the LLB or JD). The LLB nomenclature was changed in 1900 at the new law school of the University of Chicago as a marketing ploy to attract students. Students had been complaining about having only a BA and an LLB (two bachelor's degrees) after seven years of college. Even with the name change, however, having the first degree in liberal arts and sciences and the first degree in law still constitutes two undergraduate degrees. One must go on in the same body of knowledge to graduate degrees before one can be considered to have mastered it (i.e., a master's degree) and then to be a scholar of it (i.e., a doctoral degree). To treat a lawyer holding one degree in law as though he or she were thereby a scholar of law omits two degrees of law (the LLM and the JSD).

So when Benjamin Barton, who teaches law at the University of Tennessee, says, “I am a law professor and have been quite interested in this case,” we ought to view him as an instructor rather than a professor, because he is not a scholar of law (i.e., he has not earned the JSD degree).[2] A clue to his true situs, educationally speaking, is in his next statement: “I had a pretty hard time following those arguments.”[3] He was referring to the oral arguments. Benjamin Barton is a lawyer who teaches law as an instructor. We should neither blame him nor be particularly surprised that some of the arguments eluded him.

In the E.U., by the way, one must have the equivalent of the LLB/JD, LLM, and JSD degrees to join a faculty of law as a professor. In fact, one must have published one’s dissertation and published another book too, at least before one can become a full professor (it might be a requirement even to become an assistant professor). I know such a law professor, and his legal education goes far beyond a year and a half of survey courses and a year and a half of senior seminars. By “beyond,” I do not simply mean more seminars.

A doctorate is more than just a few additional years of classes. One must sit for long comprehensive exams (over anything in the discipline) and an oral exam, as well as write and defend a book-length work of original research (i.e., a dissertation). So adding another year to a physical therapy program does not make the degree a doctorate. Also, a doctorate must be the terminal degree in a body of knowledge, whether or not a particular school offers that degree. Lest a master's (e.g., the MFA) be presumed to be a terminal degree, comprehensive exams and a dissertation (as well as advanced seminars) must also be part of a degree for it to count as a doctorate.

Just as everyone today is a “professional,” we as a society have a habit of naively assuming that someone is an expert simply because they claim to be one. Whether it is journalists who presume to know the inner workings of the U.S. Supreme Court or lawyers who teach law yet somehow have trouble following the Court’s oral arguments, a good bit of self-restraint is called for in terms of self-entitlement.


1. Mike Sacks, "Supreme Court Health Care Decision: Individual Mandate Survives," The Huffington Post, June 28, 2012.
2. Adam Liptak, “Justices’ Cerebral Combativeness on Display,” The New York Times, April 3, 2012.
3. Ibid.