Saturday, March 21, 2015

American Students View College as Job-Training: Forsaking Education?

“What is your major?” is a mantra (and undoubtedly a pick-up line too) on college campuses. By giving students some exposure to a variety of academic disciplines, distribution requirements are meant in part to help students make more informed decisions about what to major in. According to a 2015 analysis of twelve randomly chosen American colleges and universities, an increasing percentage of students since the recession of 2009 had been circumventing this help by declaring their majors during their freshman year.[1] The reasons, according to The Wall Street Journal, are pragmatism, student debt-loads, and a difficult job market. “In 2012, nearly half of college graduates between the ages of 22 and 27 were unemployed or had jobs that didn’t use their degrees.”[2] In response, a higher proportion of students were going to college simply to get a job. Although the Journal lauds this reduction of education to vocation, even more striking is how even academic administrators mischaracterize the intellectual mission of colleges and universities.

The associate vice president for enrollment and marketing at DePaul, for example, told the Journal, “People don’t go to college anymore to be fulfilled or to gain life perspective; they go to get a great job.  . . . There’s been a shift from hippie culture to corporate culture.”[3] By overgeneralizing becoming knowledgeable through formal learning as merely being “fulfilled,” the administrator makes the benefits seem so vague that they could be gained in other ways.

To claim that gaining knowledge in, say, chemistry or mathematics gives a person “life perspective” treats a distant byproduct as if it were the main point. A survey of freshmen at American colleges and universities in 2014 found that 45% of them believed that “an essential or very important objective of college was to develop a meaningful life philosophy.”[4] In 1971, 73% had held that belief. Those students who cited the belief got higher education wrong because they missed the obvious point that an education makes a person knowledgeable whether or not he or she uses that knowledge to develop a life philosophy.

Were higher education primarily to give students a meaningful life philosophy, the benefit would hardly be worth the cost in student loans alone. It is no surprise, given this “understanding” of the purpose of a liberal arts and sciences education, that 82% of the surveyed freshmen in 2014 said college is essential to becoming very well off financially. In 1971, only 37% thought so, and by 2006 the figure had risen to 73%.[5] In other words, shortchanging the real benefits of being knowledgeable, as distinct from being skilled, has facilitated the shift toward declaring a major in one of the professional schools early.

Of course, studying chemistry, for example, can result in a job. In fact, the higher-order analytical and synthetic thinking that goes beyond the critical thinking (skills-training and problem-solving) taught in professional schools can be extremely valuable on the job, as well as in virtually any domain of life. A student might major in accounting, for example, only to find that he is bored to death on audits. Having used his electives on business classes so he could graduate on time after switching majors from biology at the end of his sophomore year, he would have had no inkling that his mind was best suited to philosophy. Meanwhile, he might never use the skills he had learned in all of his accounting courses.

I submit that resisting the temptation to reduce a college education to vocational training at least until the student has had some exposure to a variety of academic fields through distribution requirements is not “hippie culture.” As the assistant dean for academic advising and career counseling at UT-Austin puts the problem, “How do you know that you don’t want to major in say, anthropology, if you’ve never taken an anthropology class?”[6] He points out that students who choose a major during their freshman year are likely to switch, and this can delay their graduation date or make it more likely they will drop out.

In fact, the hippies of the late 1960s trivialized knowledge by presuming that they could teach themselves in “teach-ins.” These were not merely a pragmatic part of the Vietnam-War protests on college campuses such as UC-Berkeley and UW-Madison; the egalitarians had decided that being well-learned is not a prerequisite for teaching knowledge. Such an attitude toward knowledge is perhaps even more dismissive than that of the job-oriented freshmen who view academic knowledge as worthwhile only or primarily for developing a life perspective.



[1] Douglas Belkin, “Freshman Are Picking Their Majors Earlier,” The Wall Street Journal, March 20, 2015.
[2] A 2014 paper by the Federal Reserve Bank of New York. Quote taken from Belkin, “Freshman.”
[3] Belkin, “Freshman.”
[4] Ibid.
[5] Ibid.

Saturday, January 31, 2015

Universities and Hospitals: Time for American States to Tax Nonprofits?

The 2015 budget that Gov. Paul LePage proposed to the Maine legislature takes aim at the “sacred cow” of property-tax exemption for nonprofit organizations. Colleges and hospitals, for example, would be levied a property tax, with places of worship and government-owned entities remaining exempt. The rationale is fairness to homeowners, who must bear a disproportionate share of the tax burden, particularly in New England, where colleges and hospitals are ubiquitous. However, I submit that a second justification exists—one based squarely on the colleges and hospitals themselves.


The full essay is at “Universities and Hospitals.”

Donations to Colleges and Universities: A Widening Inequality among Schools

For the fiscal year ending June 30, 2014, colleges and universities in the American States received a record $37.5 billion in donations, which represents a 10.8% increase from the prior year.[1] Alumni giving increased 9.4% to $9.85 billion, with the average graduate gift up more than 25% to $1,535.[2] The top ten universities by amounts raised snagged 18% of the total, thus doing disproportionately better than even the schools in the middle stratum. What is behind this growing economic inequality, and can anything be done to reverse the trend?
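The scale of the imbalance is easier to see in dollar terms. The following is a minimal back-of-the-envelope sketch in Python, using only the figures reported above; the prior-year total and the per-school average are my own illustrative derivations rather than reported numbers.

```python
# Back-of-the-envelope arithmetic on the FY2014 giving figures cited above.
# The derived quantities (prior-year total, per-school average) are illustrative.
total_gifts_fy2014 = 37.5e9      # total donations, fiscal year ending June 30, 2014
growth_rate = 0.108              # 10.8% increase over the prior year
top_ten_share = 0.18             # share of the total raised by the top ten schools

prior_year_total = total_gifts_fy2014 / (1 + growth_rate)
top_ten_total = top_ten_share * total_gifts_fy2014
avg_per_top_ten_school = top_ten_total / 10

print(f"Prior-year total:        ${prior_year_total / 1e9:.1f} billion")        # ~$33.8 billion
print(f"Top-ten combined:        ${top_ten_total / 1e9:.2f} billion")           # ~$6.75 billion
print(f"Average top-ten school:  ${avg_per_top_ten_school / 1e6:.0f} million")  # ~$675 million
```

By this arithmetic, the average top-ten school raised on the order of $675 million, a figure consistent with the Harvard and Stanford totals cited below and far beyond what a school in the middle stratum could expect.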

Ann Kaplan, at the Council for Aid to Education, explains that the colleges and universities in the middle of the pack are less likely to have alumni who are as active, wealthy, or well-connected.[3] I suspect, however, that the level of activity takes a nosedive at the level of local universities, which tend to operate as commuter schools. Also, while the propensity to do well financially years or even decades after graduation is likely enhanced by having received an excellent education, the top ten universities by donation level in 2014 do not match up with the rankings based on quality of education. To be sure, Harvard and Stanford came in first ($1.16 billion) and second ($927.5 million), respectively, but they are followed by non-Ivies including the University of Southern California, Northwestern, and Johns Hopkins. Princeton, Yale, Brown, and Dartmouth did not make the cut, whereas the University of Washington did.

Surely, the performance depends at least in part on the quality of a university’s development office, and schools in the middle of the donations-ranking can certainly invest in that. In fact, improvement may not hinge on increasing budgets. For example, “(a)s schools get better at tracking down alums due to the growth of social networks such as Facebook and LinkedIn, they’re able to tap those graduates for targeted gifts to support student scholarships and new buildings, among other things.”[4] Encouraged by the financial bonus system, I made a habit, while working part-time at Yale’s development office as a student there, of asking the alumni whom I called what they had liked about their experience at Yale. I would then casually mention that I could target a donation to a particular residential college or to a group such as the Glee Club. Even a small donation would make a difference if narrowly targeted, and I did well in the bonus contests.

Beyond improving fundraising skills and tapping into the opportunities available through social media, university administrations bent on increasing donations could do the unthinkable: get the staff to actually be nice to students, who stand a good chance of being alumni one day. This incredible insight is not as easy to act on as it might seem. At the University of California at Davis, for example, university police decided to use pepper spray on students sitting on a campus sidewalk to protest a tuition hike in late 2011. For the university administrators at Davis, a heads-up: spraying pepper into students’ eyes at close range is not the best way to get them to donate years later. Just saying.



[1] Melissa Korn, “Harvard Tops Record Year for College Gifts,” The Wall Street Journal, January 28, 2015.
[2] Ibid.
[3] Ibid.
[4] Ibid.

Thursday, January 15, 2015

Should the U.S. Government Have a Role in Elementary and Secondary Education?

In a speech in January 2015, U.S. Education Secretary Arne Duncan urged a continued central role for the federal government in education policy. He said the president was proposing to increase federal spending on elementary and secondary schools by $2.7 billion; Congress had appropriated $67 billion to the U.S. Department of Education—with $23.3 billion for the Elementary and Secondary Education Act—in the 2015 budget.[1]  Typically, debate on the federal government’s role had focused on the use of standardized tests in holding schools accountable. I submit that a self-governing people has a duty to consider the wider implications, such as the impact of a greater role on the federal system. Otherwise, unintended consequences may show up after it is too late to do anything about them.
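To put the proposed increase in proportion, here is a minimal sketch in Python using only the figures cited above; which baseline is most relevant (the department’s total appropriation or the ESEA portion) is my own framing, not the secretary’s.

```python
# Relating the proposed $2.7 billion increase to the FY2015 appropriations cited above.
dept_of_ed_appropriation = 67.0e9   # total appropriated to the U.S. Department of Education
esea_appropriation = 23.3e9         # portion for the Elementary and Secondary Education Act
proposed_increase = 2.7e9           # proposed increase for elementary and secondary schools

print(f"Increase vs. department total: {proposed_increase / dept_of_ed_appropriation:.1%}")  # ~4.0%
print(f"Increase vs. ESEA portion:     {proposed_increase / esea_appropriation:.1%}")        # ~11.6%
```

Either way, the question the essay raises is less the amount than the impact of a greater federal role on the federal system.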

The full essay is at "Federal Policy in Education."

Thursday, November 20, 2014

Ethical Theory in Business Ethics Courses

It may seem like an oxymoron, but faculty administrators even at research universities can be hopelessly narrow-minded regarding knowledge and how it is to be conveyed. For example, how often are faculty members encouraged to give a lecture or two re-teaching material largely missed on exams (followed by another, shorter examination on that material)? Do faculty administrators work with faculty members in professional schools to see to it that the applied courses are not severed from their basic (i.e., more theoretical) disciplines? One of the secrets in the “sauce” at Yale’s professional schools (e.g., Law and Divinity) is the salience of the respective basic disciplines (political theory and theology, respectively). Synergy comes gushing through once the false dichotomy is recognized. Before I went to Yale, I was a masters and doctoral student at the University of Pittsburgh, where the dichotomy was alive and well in the university’s social reality; I had to “walk back” the dichotomy myself as I discovered philosophy (and religious studies) while I was still studying business.

Business ethics was one of my doctoral fields of study at Pitt. The philosophy department there was at the time one of the best in the U.S., so I used an elective to take my first course in the discipline. I began with two intro courses; before I knew it, I was taking junior- and senior-level courses, such as logic and philosophy of mind. The latter turned out to be the most intellectually intense course I took in my 18 years as a university student (had I discovered philosophy in college, I would have three rather than five degrees). It occurred to me at the time to start taking ethical theory courses, as business ethics was one of my doctoral fields. Within philosophy, I gravitated to practical philosophy—in particular, to ethics, political theory, and philosophy of religion. I treated these as foundations for the field of business, government, and society in business schools.
It dawned on me that none of the business doctoral students concentrating in business ethics had taken an ethical theory course in philosophy. That is to say, I was stunned to find a subfield of ethics reduced to management. Ethics proper is a subfield of philosophy, not business; ergo, business ethics is ultimately grounded in philosophy, with managerial implications. I think business schools have put the cart before the horse and let go of the horse. A cart without a horse isn’t going to go very far (though perhaps it can go in circles).

From my educational experience, I contend that ethics courses in business schools ought to emphasize ethical theory, with managerial implications and applications used as much to illustrate the theories as to understand the ethical dimension of business. Managers in the business world have told me that business schools should do what corporate training cannot, rather than being duplicative. I think deans miss this point, perhaps because they are so oriented to sucking up to corporate managers in order to get corporate donations. In my own thinking, theory enlivens rather than detracts from praxis. I think business school faculties are in the grip of the false dichotomy. Corporate managers would doubtless admit that they are ill-equipped to teach ethical theory. Moreover, training is a better fit with what corporate folks do. Business schools, or else philosophy departments, could offer regular as well as continuing-education courses in business ethics with ethical theory readings, lectures, and discussions going beyond the hackneyed “rights, utility, and justice” reductionism of business ethics courses in business schools.

Sunday, November 2, 2014

Exclusivism Eclipses Veritas at Yale

Michael Simons, head of the cardiology department in Yale’s School of Medicine, made unwelcome sexual advances in writing to Annarita Di Lorenzo, a researcher at the school 18 years his junior, in 2010. Simons wrote that he wanted to kiss the woman’s lips, and every part of her “body in every continent and city of the world.”[1] Referring to Frank Giordano, the woman’s boyfriend at the time and subsequently her husband, Simons wrote that she was choosing the wrong man. Simons would keep Giordano from important meetings and assignments. The relationship between the two men became so difficult that Jack Elias, the chairman of medicine, took over the direct supervision of Giordano to protect the untenured instructor from Simons.
Nevertheless, Yale’s provost dismissed a university committee’s recommendation that Simons be permanently removed from his position, opting instead for an 18-month suspension. Faculty members claimed that Simons’ success in snagging $5 million annually in grants from the U.S. Government in 2012 and 2013 had been a factor, as had the fact that the provost had been chair of the economics department, where Simons’ wife was a faculty member. The monetary element would not be lost on virtually any academic administrator at any university, but the “old boys club” sticky web of connections at the elitist Yale could mean that “outsiders” suffer considerable abuse there; the provost’s dismissiveness of the university committee’s recommendation is but one indication of how distorted the moral compasses can be among the most powerful in the “club.”
As most Yalies undoubtedly know, incremental exclusivisms exist within Yale; the pleasure for a faculty member in being selected for a higher post, or for a student in being “tapped” to join one of the secret societies, strangely depends in no small measure on being able to see others excluded (i.e., hurt). In other words, rather than finding the sheer opening of a door to be fulfilling, the true pleasure lies in watching the door hit others in the ass as the hinges swing back.
For example, as a student at Yale, I spent an “all-nighter” going through the admissions ritual of one of the debating parties in the Yale Political Union. Once I was in, the party’s chairman suggested that I attend a party that would be held on an upcoming Friday night. “The party owns a secret society,” he explained to me, “so you and all the other guys in the party can join the society.” Assuming that I would be able to join the society too, I cancelled my preexisting plans and attended the party, which was held in a room halfway up the university’s clock tower. In actuality, only three people were tapped. They had all been officers in the party’s elite, so I quickly realized that the chairman had wanted me to attend not because I would be tapped, but, rather, so I would watch his friends being selected. The chairman subsequently lied to party members—saying that I had misunderstood him. Whereas most party members in such a situation typically resigned with great fanfare during one of the weekly debates, I did not resign; instead, I simply had nothing more to do with the party or its little cadre of officers. To this day, I am a member of that party in the Yale Political Union.
My point is that deceit used to protect the club within the club at Yale is not a rare deed. Perhaps you can grasp more fully why Di Lorenzo may have felt no avenue was left to her but to leave the university, why her husband’s career was effectively stalled by a jealous faculty administrator, and why the provost dismissed the committee’s recommendation. The pyramidal exclusivism within Yale is so strong that unfairness and even outright aggression may seem justified to some.


[1] Tamar Lewin, “Handling of Sexual Harassment Case Poses Larger Questions at Yale,” The New York Times, November 1, 2014.

Thursday, September 18, 2014

Can Ethical Leadership Be Taught?

Can ethical leadership be taught? In the typical business school, this question would be interpreted, or “refurbished,” as: Can students be trained to become ethical leaders? While often conflated, these two questions are indeed distinct. Instructors, professors, and school administrators should first decide which question is more relevant to their purposes. The question chosen should fit with the education, pedagogical method, and philosophy of education of not only the instructor or professor, but also the school itself. In this essay, I distinguish the two questions in order to unpack their full significance.

The question, Can ethical leadership be taught, can be interpreted as being centered on knowledge of the concept and theories of ethical leadership. Can this particular knowledge be taught? That is to say, if a student were to ask, What is ethical leadership? could the instructor or professor answer with a definition? Have scholars even come up with an agreed-upon definition? More broadly, how does ethical leadership as a concept differ from leadership more generally? Do theories of ethical leadership explain it, rather than merely being oriented to how-to? Furthermore, do any extant theories relate the concept to other, related concepts such as strategic leadership or even strategy? If so, can such theories be taught to students at a particular level of education? Last but not least, would teaching the theories toward an understanding of what ethical leadership is be in line with the approach of the particular business school? Some schools are more commercially oriented than others.

The pedagogical goal in line with the question of whether the knowledge we have on ethical leadership can be taught is that students know more about ethical leadership as a phenomenon. Studying ethical leadership in applications, for instance through case studies, is oriented here to understanding more about the concept through how it is exercised in practice. To be sure, praxis (i.e., practice) cannot capture the entirety of a given concept. Accordingly, the case studies and even in-class simulations are secondary here.

Moreover, whether or not the students become more able to be ethical leaders themselves is theoretically separate, even if, as Plato wrote, knowing the good is all one needs in order to do it on a regular basis. In other words, understanding what ethical leadership is may contribute toward a student being able to practice ethical leadership, but this byproduct is not the immediate goal in getting students to understand what ethical leadership is. In this perspective, moreover, a business school’s faculty studies the phenomenon of business so as to understand it better.

Alternatively, a business school may be oriented to training its students to be practitioners of particular skills in business. Here, the question of whether teaching ethical leadership can facilitate or enhance a student’s ethical leadership skills in practice is relevant. Crucially, we have shifted qualitatively from education to training—from knowledge to skills.

In their article on teaching ethical leadership, Carla Millar and Eve Poole point to character, or moral intuition, as being improvable in the classroom setting if experiential learning is used.[1] Whereas case studies involve examining how someone else exercised, or should have exercised, ethical leadership, experiential learning focuses on the decision-making of the student. Asking “How would I handle this situation?” according to the authors, stimulates a student’s own moral compass, which in turn can facilitate the student’s skills in ethical leadership. Here, how is the operative question word, as befits an approach in which skill is primary.

Millar and Poole can be criticized for treating a simulation in a classroom as a “real” situation. Relatedly, lab experiments in psychology are not in the “real” world, so the artifacts of the lab environment must be taken into account in analyzing the data. Even if a simulation, or putting the students in the place of an ethical leader in a case study, gives the students an experiential sense of how to be an ethical leader, it is quite a leap to propose that such classroom activities make students more ethical, or more inclined to be ethical, should they become leaders. Ethical leadership is not like typing. Therefore, I think Plato’s approach is superior even though, ironically, it puts knowledge of the phenomenon first.

In terms of institutional conflicts of interest, being able to recognize them in practice, which is presumably preliminary to an ethical leader being able to avoid them, depends on first knowing what an institutional conflict of interest is. What is the relation between the two roles in a conflict of interest? How do they differ? What is a conflict of interest? Is such a conflict unethical even if it is not exploited? The scholarly literature is split on this last question. Using case studies to learn how to handle institutional conflicts of interest depends on the student generalizing across a sufficient number of cases to figure out what a conflict of interest is. Most likely, however, an instructor simply assumes that presenting a few conflicts of interest to the students is sufficient for recognition purposes, and thus for being able to deal with them. The problem with this approach is that understanding is necessary for recognition beyond the few cases a student has studied. Moreover, studying what a conflict of interest is makes it more likely that a student will take such conflicts seriously, ethically speaking.

It should be pretty obvious where I stand with respect to the two approaches. I contend that, paradoxically, learning what we know of an applied concept is the best means of enabling students to use the concept in praxis. Abstract knowledge and praxis are synergistic rather than antithetical. If I am correct here, then business school deans who are under the impression that companies want the schools to train future employees are not even serving those companies well by replacing education with training. Generally speaking, if something looks too convenient, it probably is; taking the more arduous route is often better.




[1] Carla Millar and Eve Poole, “Business Schools Are Failing to Teach Ethical Leadership,” The Guardian, November 26, 2010.