Monday, December 31, 2018

Enabling Non-Empathetic Leaders: The Case of Paterno at Penn State University

In January 2011, the illustrious football coach at Penn State University, Joe Paterno, learned that prosecutors were investigating his longstanding assistant coach, Jerry Sandusky, for sexually assaulting young boys in the football team’s locker room. Paterno even testified before a grand jury on the matter that month. He had been informed of the rapes back in 1998, yet he had kept Sandusky on staff even though doing so put additional boys at risk.
That same month—January 2011—Paterno also began negotiating to amend his contract, which was not due to expire until the end of 2012. By August 2011, Paterno and the president of Penn State reached an agreement in spite of the fact that both were by then embroiled in the Sandusky investigation. “Paterno was to be paid $3 million at the end of the 2011 season if he agreed it would be his last. Interest-free loans totaling $350,000 that the university had made to Mr. Paterno over the years would be forgiven as part of the retirement package. He would also have the use of the university’s private plane and a luxury box at Beaver Stadium for him and his family to use over the next 25 years.”
The university’s board was kept in the dark. Directors who raised questions were “quickly shut down.” In the end, the board gave the family virtually everything it wanted. The board even threw in free use of specialized hydrotherapy massage equipment at the university for Paterno’s wife. In other words, Paterno (and his surviving family, following his death in January 2012) got an even better deal as the scandal came to include Paterno himself.

Joe Paterno, head football coach at Penn State, viewed by a student as "Pa" in PA. (Image: Matt Rourke/AP)

The full essay is at "Enabling Non-Empathetic Leaders."

Monday, September 10, 2018

Just the Facts: Empirical Social Science Overplayed

Tilburg University in the E.U. is known for its emphasis on empirical studies in the social sciences (including business). With this bent, the university is typically considered closer to the American academic tradition than to that of Europe. So when Dr. Diederik Stapel, a psychology professor at Tilburg, acknowledged having committed academic fraud in several dozen published articles in academic journals, the academic status of empirical research itself was thrown into question. Experts point out that Stapel “took advantage of a system that allows researchers to operate in near secrecy and massage data to find what they want to find, without much fear of being challenged.” Indeed, it is rare even for peer reviewers of potential articles to demand to see the raw empirical data supporting a given study’s conclusions. According to Dr. Jelte Wicherts, a psychology professor at the University of Amsterdam, the problem of data being misused by the scholars who collect and analyze it is widespread in the discipline of psychology.

In a survey of more than 2,000 American psychology professors, Leslie John of Harvard Business School found that 70 percent had anonymously acknowledged cutting some corners in reporting data. Add to this the problem of unintended statistical errors, and the problem of relying on scientific results becomes acute. Dr. Joseph Simmons, a professor of psychology at the University of Pennsylvania’s Wharton School of Business, says, “We know the general tendency of humans to draw the conclusions they want to draw.”

Indeed, the “academic” field of corporate social responsibility has been rife with “scholars” writing to impose or justify their critical ideology of the modern corporation. For example, at Amitai Etzioni’s conference at Harvard Business School on his theory (or movement?) of socio-economics, one professor demanded that the participants form a labor party. The Harvard professors in attendance pointed out that Etzioni was simply trashing the neo-classical economic paradigm (economic liberalism, or free-market competition) without proffering an alternative theory. This did not stop Dr. Etzioni from continuing to advance his agenda, which I submit was precisely to condemn neo-classical economic theory. Similarly, “scholars” of CSR tend to presume that corporations have an obligation to share corporate governance with stakeholder groups and to give more philanthropically. Never mind that the purported obligation is typically not justified beyond the “scholar’s” own ideology. I would be surprised if the empirical research were not highly skewed in the direction of that ideology.

Of course, the problem of empirical science is not limited to disciplines such as psychology and business & society, which are particularly subject to ideology. Once I sat in on a doctoral seminar on strategy. The professor, who would go on to get tenure at a major business school, advised the doctoral students to check with the managements of the companies they were surveying before publishing the results, in case any of the managements did not like the conclusions. Otherwise, the “professor” observed, consulting opportunities might be diminished. That several of the students had been bankers and would be conducting empirical studies of the financial sector ought to concern anyone who has heard of “too big to fail” and the related over-reliance on models designed to manage risk.

So whether in dealing with human psychology or huge financial firms, skewed empirical research can be dangerous. Politically, the CSR agenda could result in too much power being amassed by stakeholder groups at the expense of property rights. Moreover, the history of psychology (and of business ethics) suggests that the emphasis on empirical studies, particularly at American universities, is ahistoric. Before the twentieth century, psychology was part of philosophy. Perhaps the problems with empirical science might lead to a re-consideration of the value of philosophical psychology in terms of knowledge as well as practice. Similarly, the interlarding of business ethics (a subfield of ethics, which in turn is a field of philosophy) with empirical surveys—as if what is counts for what ought to be—can be questioned. Rarely does a business ethicist stop to wonder why philosophers do not send out surveys as part of doing philosophy. David Hume’s is-ought problem provides a good explanation for why they do not.

My overall point is that the value of empirical studies in the social sciences (and applied philosophy) has been overstated, particularly at American universities, while theory development and the historic housing in philosophy have been relegated or dismissed outright. Along with the hypertrophy of empiricism has come a “cubby-hole” mentality wherein Frederick Taylor’s specialization of labor has somehow been applied to scholarship. One could excuse business schools for conflating what they are studying with what they are. The problem is when the academic enterprise itself comes to resemble enterprises that make widgets. It is no accident, I submit, that the twentieth century will not be known for many bright spots in the social sciences or philosophy. One could say that Plato and Nietzsche make good book-ends, with engineers and natural scientists taking over to produce a technological and information revolution. Yet who asks what the opportunity costs have been in reducing progress to the technological variety? What cost was there in the twentieth century in having technicians and ideologues for philosophers, rather than thinkers capable of seeing the big picture and proffering unique vistas? If the case of Dr. Stapel comes as a surprise, it might be because we have become too ensconced in “facts” at the expense of meaning.

Benedict Carey, “Fraud Case Seen as a Red Flag for Psychology Research,” The New York Times, November 3, 2011. 

Saturday, February 17, 2018

Physicians and Lawyers: On the Presumption of Ignorance

It would surprise virtually every American (but only a few Europeans) to know that neither the JD nor the MD degree is a doctorate. Each one is the first degree in its school, or discipline. Yet we presume them to evince advanced knowledge, even allowing people with two undergraduate degrees to be "professors" (really instructors) in American law and medical schools. In the school of law, the sequence of degrees is: JD (the same degree as the LLB), LLM (hint: M...Masters), and JSD (Doctorate in Juridical Science). The JSD includes advanced study, a comprehensive exam (an academic exam graded by faculty, not an industry-qualifying exam like the bar), and a defended dissertation. A doctoral degree must be the terminal degree of a field, contain a comprehensive exam, and include significant original research in a defended dissertation. The JD misses on all three points. The title of the first degree in law, the LLB (bachelor of letters of law), was replaced with "JD" largely for marketing purposes in 1901 in the founding of the University of Chicago law school (by three Harvard professors) because prospective students were complaining about having two "B" degrees after seven years of school. People don't like to think they have gone to school for seven or eight years for two undergraduate degrees, but this is precisely what they have done. Nevertheless, the new law school, in need of students, complied with the "customer" complaint in a feat of mirrored marketing that was perhaps intentionally ambiguous. To dissolve the ambiguity between "Juris Doctor" and a doctorate, one must look beyond the mere words.

In medicine, the MD is the first degree. Substantively, it contains survey courses and some seminars, just as in a BA or BS program in liberal arts or sciences. The D.Sci.M. is the doctorate in the field of medicine, and the M.D. is a prerequisite (so the latter cannot be the terminal degree of the field). The fact that some schools give the D.Sci.M. degree as an honorary degree does not mean that it does not exist elsewhere as the real, terminal degree. Particular medical schools may give the degree as honorary where there are not enough prospective students interested in a doctorate in medicine.

In divinity schools, the M.Div. (before 1968, called the BD) is the undergraduate degree. It is followed by the STM (the master’s) and the DD. When the BD name was changed to the M.Div. name, a perhaps-deliberate ambiguity was created wherein one apparent master’s (the M.Div.) would be followed by another (the STM). It evinces a category mistake to have two master’s degrees with one being substantively prerequisite to the second. Substantively, the M.Div. program consists of a year and a half of survey classes, followed by senior seminars (just as in the undergraduate law, liberal arts & science, and medical programs). To regard a graduate with an M.Div., JD, or MD as having achieved advanced knowledge in the respective field is a fallacy perpetuated by the superfluous esteem we heap on the "professions" on account of their association with money (the religious vocation being revered for sacrificing the vaunted wealth).

It makes no difference how many degrees a person has in other fields before commencing study in a professional school. In beginning to study law, medicine, or theology, one begins with survey courses. Furthermore, it doesn't matter whether one's particular school or even country offers the doctorate in the field. Try telling people that your BA is a doctorate in English because no Ph.D. in the field is offered at your college or even in your country. Every field (just like life itself) has a first degree and a terminal degree. A student does not obtain advanced knowledge in two or three years in a law, medical, or divinity school, but only a first degree's worth in liberal arts and sciences.

Sadly, we as a people have esteemed physicians and lawyers so much that we have vaunted them by unwittingly appreciating their degrees into the stratosphere. One degree in a given field does not a doctor make. Europeans have been correct in refusing to call an American physician "Dr. Smith." The fact that Mr. Smith would take offense just points to the arrogance that lies in ignorance. The rest of us enable Mr. Smith to claim the doctoral title before his last name because we don't know any better. We give physicians titled trophies that they do not deserve. Moreover, the use of vocational titles (including "Professor Jackson") risks a vocational reductionism wherein a person is regarded (and comes to regard himself) as that which he or she does. Is vocation really so important that it eclipses or overcomes a person's identity?

Maybe it is time that we say "enough is enough" on the green glitter and deflate the entitlements that have gone along with being a "professional" to a level more fitting to what the professionals have actually earned. The extent of illusion that a society can create and maintain is astonishing, yet being inside the illusion (think here of the Matrix) we do not see it. It is time to see the green numbers on the wall. No wonder even the hint of such sight is apt to incur the wrath of the agents who instinctively protect the illusion because they benefit inordinately from it. It is time, ladies and gentlemen, that we wake up, as the sun is already quite high in the sky and there is much to be done.

Monday, October 16, 2017

Paul Samuelson: The Model 20th Century Economist

Paul A. Samuelson, the first American Nobel laureate in economics and the foremost academic economist of the 20th century, died at the end of 2009 at 94. Samuelson was credited with changing the academic discipline of economics, according to The New York Times, “from one that ruminates about economic issues to one that solves problems, answering questions about cause and effect with mathematical rigor and clarity.” Essentially, he redefined twentieth-century economics. Mathematics had already been employed by social scientists, but Dr. Samuelson brought the discipline into the mainstream of economic thinking. His early work, for example, presented a unified mathematical structure for predicting how businesses and households alike would respond to changes in economic forces, how changes in wage rates would affect employment, and how tax-rate changes would affect tax collections. He developed the rudimentary mathematics of business cycles with a model, called the multiplier-accelerator, that captured the inherent tendency of market economies to fluctuate. Mathematical formulas that Wall Street analysts use to trade options and other complicated securities (derivatives) have come from his work. (FYI: derivatives too complicated for outsiders such as the government to understand and regulate were at the center of the financial crisis in 2008.)

While The New York Times article covers his career in a positive light, I believe the picture is more complicated—and telling of twentieth-century American society. At the surface, the tale seems to center on a dichotomy—the Keynesian liberal against his conservative monetarist friend, Milton Friedman. Perhaps the principal issue between them was whether market equilibrium could rest at full employment (i.e., without government help). Samuelson’s own work on the inherent volatility of markets would suggest that the market mechanism does not necessarily reach an equilibrium, even at less than full employment. As we saw in September of 2008, a market can collapse from within. I am reminded of Alan Greenspan’s testimony before Congress shortly thereafter, when he admitted a fundamental flaw in the assumptions of his free-market paradigm. Clearly, more thought is needed on the nature of a market and how its basic contours can be altered; government regulation alone is not sufficient.

Unfortunately, such “big picture” theorizing was on the wane in twentieth-century economic thought, which focused on narrow problems using technical tools such as mathematical formulas. To be sure, Samuelson’s technical work gives us reassurance that the market contains a fluctuating element. However, the reform of an economic system at a basic level is not simply the sum of a bunch of smaller solved problems. I submit that while mathematics is useful for problem-solving, more is needed to understand our economic system and alter the basic contours of the market mechanism.

Fundamentally, none of the social sciences is really a science. To project the certainty of natural science onto any of them is inherently limited and potentially risky. To be sure, value can be gained from applying quantitative tools to limited problems, but the inherent indeterminacy of human macro systems makes the scientific approach ultimately futile from the macro standpoint of the social “sciences.” Their phenomena, in other words, are not of the sort that can be measured and predicted like the speed of a comet in space or a chemical reaction in the isolated environment of a lab. Economic, social, and political systems just aren’t like that. Explanation, rather than prediction, is primary where human indeterminacy is so salient.

Another way of relativizing the “mathematical problem-solving” orientation of 20th-century economics is to look at different levels of thinking. In the wake of the problem-solving orientation, business schools regularly tout “critical thinking,” which is really just problem-solving. You wouldn’t know it, but higher forms of thinking do exist—namely, synthetic and analytical reasoning. To treat problem-solving as the litmus test for a discipline is to reduce that discipline from what it could be, academically speaking; it is to short-change it by forcing it into the low-ceilinged box of practicality. It is to put blinders on. Samuelson’s mathematical axis inadvertently made the discipline of economics more oriented to solving particular problems than it had been in the past. Consider by contrast the work of Smith, Marx, Hayek, and Veblen—not a plus or minus sign among them, yet their work addresses economics at the level of systems. Moreover, their thought transcends mathematical problem-solving.

I am not dismissing the value of solving specific problems, and Dr. Samuelson deserves credit for providing the tools for it; rather, I am suggesting that the legacy of the twentieth century in general and economic “science” in particular might be a reductionism to a technical orientation toward solving particular problems. That is to say, empiricism as hegemonic; problem-solving as the principal activity (and reasoning). Such an orientation is rather narrow, and therefore not apt to survive on top indefinitely. The “big picture” questions raised by the financial crisis of 2008 include matters like “too big to fail” and the viability of the market mechanism itself that go beyond solving particular problems. So I would not be surprised if a return to theoretical economics (and political economy, for mathematics in the latter has been part of the wedge that has artificially dissected the two) were not too far off. The twentieth century is leaving us. I for one have few regrets over its passing; I think it will go down in history as decadent (meaning decaying from within, the 1970s being its epitome). What Samuelson did for economics is more a function of his era than anything else. Such value is limited.


Thursday, May 11, 2017

The University of California: University Governance Gets an “F” on Trust

As part of the government’s 2017 audit of the University of California’s president’s office, California’s auditor, Elaine Howle, sent surveys to administrators at the university’s 10 campuses. The president’s staff directed administrators at the Santa Cruz, San Diego, and Irvine campuses to remove criticism of the office and give higher performance ratings in key areas. The interference was blatant, as it included even a systemwide conference call. As a result, Howle disregarded all of the results as tainted. The audit also uncovered $175 million in undisclosed reserves being held by the president’s office. Janet Napolitano, the U.C. president and former head of the U.S. Homeland Security Department, had betrayed the trust vested in her. The ineptitude likely ran higher, and lower. That is to say, the university’s governance itself was culpable.

For an office with a $686 million budget (the entire university’s budget being $31.5 billion in 2017) and nearly 1,700 employees to betray the trust of the university’s board of regents, the Government of California, and the general public is, as Assemblyman Phil Ting said, “outrageous and unbelievable.”[1] Ting compared the interference to a student who is failing “and magically the professor changes the grade and passes the student.”[2] In fact, the duplicity went beyond Napolitano’s office, for Howle had directed the administrators at the campuses to keep the surveys confidential and yet one UCSF administrator felt entitled to inform Napolitano’s staffers, who in turn began directing administrators on how to respond to the surveys. George Blumenthal, chancellor of the Santa Cruz campus, sent an email to his staff noting that the president’s office was not happy with a long paragraph, so he added, “I suggest you remove the paragraph and submit it.”[3] That a spokeswoman for the president noted that the chancellors had “not been shy in offering opposing views” to that of the president can thus be taken as yet another attempt to mislead.[4]

The irony is that California’s taxpayers had been funding “profligate” salaries of university administrators even as funding cuts mandated by the legislature had hit other areas of the university.[5] For their part, faculty members were not surprised—faculty leaders noted that cynicism had crept in for years as the university’s governance had increasingly sidelined their voices.[6] Considering both the healthy slush fund and the efforts to manipulate the audit’s survey, as well as the sordid reputation of the university’s administration among the ranks of faculty, the conclusion may be that the university’s board of regents had failed to provide adequate oversight. In other words, the weak link may actually run higher than into the president’s office.

[1] Nanette Asimov, “3 UC Campuses Change Responses in State Auditor’s Survey,” San Francisco Chronicle, May 10, 2017.
[2] Ibid.
[3] Ibid.
[4] Mike McPhate, “California Today: A Cloud Over the University of California,” The New York Times, May 11, 2017.
[5] Ibid.
[6] Ibid.

Tuesday, May 9, 2017

Student Teaching-Assistants Hunger-Strike at Yale: Facing an Implacable Wall

During the Spring term of 2017, some graduate students at Yale began a hunger strike to pressure the administration to negotiate with their union. At the time, about 70 percent of the instructors at American colleges and universities were part-time—including adjunct instructors and graduate students working as teaching assistants. They were poorly paid and lacked “access to affordable health care, job security or a voice in their working conditions.”[1] I contend that while we should not gloss over the real differences between adjunct instructors and teaching assistants, the latter role contains an employment element that warrants representation by a union.

Graduate students who work as teaching assistants hunger-strike in front of Yale's administration building (to the right). Directly behind the protesters is the Commons dining hall (which I remember for the Belgian waffles...the gym fortunately being close by). (Source: NYT)

To be sure, the position of a graduate student leading discussion sections of a professor’s course is quite different from that of an adjunct instructor teaching a class or two per term at a university. A graduate student works as a teaching assistant for only a few years, and upon graduation can look forward to beginning a career; even a professorship is not merely an extension of being a teaching assistant. The unique academic properties of the teaching-assistant role are borne out by the fact that only students qualify. The compensation is a stipend, typically viewed as a form of student financial aid, and the teaching role is designed to teach the student how to teach—and even to provide the student with additional knowledge.

As a teaching assistant at Yale, I jumped at the opportunity to teach the History of Modern China and the History of European Integration (e.g., the EC and the E.U.) precisely because I could learn more than what was offered in the courses I was taking. I was by no means a student—not to mention an expert!—of China or the European Union. The Yale administration held that its graduate students could aptly lead discussion sections on material outside of our main areas of study because we learn so well. So it is strange that the administration during the Spring term of 2017 hired union-busting lawyers to argue “that for many of the courses [the TAs] teach, these graduate students ‘have no subject matter expertise’ and therefore don’t qualify as professors.”[2] No TA would claim to be a professor! More to the point, Yale’s position, through its lawyers, concerning the lack of subject-matter expertise is misleading, given the learning aspect of being a teaching assistant—learning not merely how to teach but also the content of the course. Yale’s administration can be astonishingly stubborn—and I wouldn’t be surprised if the hunger strike were at least in part a reaction to that passive aggressiveness itself. I suspect that its root lies in power and felt superiority; union representation could hardly make a dent economically in such a rich university.

In August 2016, the National Labor Relations Board had ruled that graduate students engaged in teaching at private colleges and universities are indeed employees and therefore have the right to collective bargaining. The decision reversed a ruling from 2004, which had held that TAs “are primarily students and have a primarily educational, not economic, relationship to their university.”[3] The 2016 ruling found that the broader relationship does not mean that the teaching role—performed on a paid basis—is not work. In short, the students are also employees. A student who works in a dorm cafeteria—as I did at my first university—is an employee in that job even though being a student is the broader status at the university. To be sure, working as a teaching assistant involves learning—both how to teach and subject content—but the tight relationship between the work-tasks and pay renders the position a job, and thus one entitled to representation by a union.

[1] Jennifer Klein, “Why Yale Graduate Students Are on a Hunger Strike,” The New York Times, May 9, 2017.
[2] Ibid.
[3] Noam Scheiber, “Grad Students Win Right to Unionize in an Ivy League Case,” The New York Times, August 23, 2016.

Friday, April 21, 2017

On the Spread of Private Governments in a Democracy: Should Churches and Universities Have Their Own Police Forces?

In mid-April 2017, Alabama’s Senate approved a bill that would authorize Briarwood Presbyterian Church to create a police department. At the time, the church hired off-duty police officers to provide security, “a common practice among nonprofit organizations.”[1] With 4,000 congregants, a K-12 school, and thousands of events on its land each year, church officials had difficulty finding enough off-duty cops who were available. More important than being able to make up for any shortages, the proposed law “would empower a religious group to do a job usually performed by the government.”[2] That the group is religious in nature whereas police power is governmental (i.e., “church and state”) is less important than that the “job” had come to be viewed societally, as per the quote from The New York Times, as usually performed by government. In other words, the slippery, subtle slope is itself a red flag.

The full essay is at "Private Police Forces."

1. Ian Lovett, “Alabama Church Wants Police Force,” The New York Times, April 17, 2017.
2. Ibid.