We Are Not Agreed

A few days ago University Ventures authored a piece in response to a post from the New America Foundation comparing Republicans who defend for-profit colleges to climate change deniers. The unattributed University Ventures article argues "this piece re-fights yesterday’s war... the many challenges and opportunities facing higher education lend themselves to bipartisan consensus – perhaps more than any other area of public policy."

Bipartisanship is of course a U.S. phenomenon. But it is worth noting that there are many things U.S. lawmakers agree upon that are opposed in corners around the world. I find myself frequently occupying those corners, and today is no exception. So, setting aside the for-profit colleges debate for another day, I'd like to take the time to address the points where I disagree with what is taken to be the emerging consensus.

The text in italics is their contention; what follows is my response.

1. Completion is the most powerful lever

The author makes the very reasonable point that "drop rates approaching 50% at many four-year institutions and 80% at many two-year colleges" represent a failure of traditional post-secondary institutions, and responds that "there’s no 'free college' silver bullet to the complex completion challenge."

But completion is not the powerful lever that drives everything else; it is the pointless anchor that weighs everything down. It is becoming increasingly untenable to stop everything in your life to complete four or eight years of studies, especially when the mechanisms for delivering that education are increasingly inefficient and expensive.

Indeed, completion would be irrelevant as a lever were it not for the mechanism of granting recognition only at the end of the four or eight year program. To be sure, students value those degrees and diplomas. They have no choice; there's no other way to earn recognition for their learning.

As recognition becomes more distributed, however, we will see other more fundamental levers emerge: the requirement that learning be relevant, that it help us solve problems, that it support networking and collaboration, and that it respect our personal interests and abilities.

2. Bachelor’s degree “addiction” is hurting students

The author argues that "it takes a Candide-like idealist to continue to insist that a bachelor's degree is the optimal or only path to establishing the core cognitive and non-cognitive executive function skills that lead to successful white collar careers."

The disagreement here is not about whether to question the relevance of the Bachelor's degree. It is actually rather more subtle than that.

First, this point can be seen as code for "we need to restrict the number of people admitted into Bachelor's programs," with the idea that alternative schools - trade schools, business skills schools, etc. - would emerge to pick up the slack. We see this in the allusion to "successful white collar careers", which already assumes the separation between advanced education and trades. The idea here is that white collar workers are the new tradespeople. But returning universities to their original position of offering education to the elite is not what I would consider a progressive step forward.

Second, this point continues to carry the presumption that the point of education is to lead you step by step toward a future career. We see this in phrasing like "optimal or only path". The presumption that education amounts to preparation should be challenged. This may be one function of education, but it is not the only one, nor even the most important one. There's no end to the stories about students being 'prepared' for a future that no longer exists. Education should be addressed toward capability, not preparation.

3. Colleges need to do much more to help graduates get great jobs.

The author's point here is that colleges and universities "must ensure students are equipped with the technical skills employers increasingly require for entry-level positions." The idea of colleges and universities being preparation for employment, whether for one's first job or eventual career, is anathema to many. From my perspective it's not a matter of faith but of common sense: chasing after "what employers want" is a mug's game you can never win, and is increasingly irrelevant in a world where you make your own employment.

First of all, if employers want certain outcomes from our education system, then why don't they pay for that themselves, rather than requiring governments and students to pay for it?

Second, if employers want certain outcomes from the system, why don't they hire on that basis, rather than on (among other things) college pedigree, connections and friends, biases and stereotypes, proximity, and willingness to work for lower wages?

Third, employers lobby for certain outcomes from the education system - computing science grads, nursing students, engineers, etc. - in order to drive down labour costs. Why should any of us support a mechanism that actually reduces our negotiating position in the marketplace? 

What about that New America survey showing that the reason students enroll is "to improve employment opportunities (91%); to make more money (90%); and to get a good job (89%)"? When you read the survey, you find it is "an online survey of 1,011 U.S. residents ages 16-40, who were largely prospective college students." So this reflects the sales pitch, but does it reflect the reality?

Colleges and universities - indeed, all of education - should help students become self-sufficient. That's what the elite programs do. That's what they should all do.

4. Employers bear much of the blame

We can certainly agree with the author that "Opaque Applicant Tracking Systems and imprecise job descriptions have turned getting in front of a human hiring manager into a 'rigged' game." And "campus-based recruitment at a select number of schools" merely reinforces this perception. Employers (and banks, and venture capitalists) aren't looking for qualifications in new employees; they know that the right person can always adapt to the needs of the position, especially an entry-level position. They are looking for the right pedigree.

That's why the proposed 'agreed upon' solution won't work, and indeed distracts from the core issues. Here's what the author suggests: "utilizing new People Analytics technologies to identify competencies that are predictive of success, incorporating these skills into job descriptions, and proactively searching among passive job seekers and current students will become a competitive advantage for farsighted employers." Nonsense.

If it accomplishes anything at all, identifying competencies will fit only short-term positions for specific tasks. As a mechanism for long-term employment and career-readiness, competencies will prove to be an unmitigated failure. Employers will care about only a few very general core competencies: can they speak and write reasonably well (and without an accent)? Do they know the jargon of the field? Can they work with other people (and especially our team)? Do they dress well and bathe themselves? Have they done this kind of thing before? Are they connected?

Should it be this way? Of course not. I too would like to see "a shift from degree- and pedigree-based hiring to competency-based hiring... while also increasing workforce diversity." But changing the way we educate people won't accomplish this result. Much broader social changes are needed, not just in the U.S. (where they are engaged in a political struggle over this point) but also around the world.

5. Accountability shouldn’t start and end with for-profit colleges

The author argues that "if we can agree on desired and measurable outcomes... while for-profit schools may need to be held to a higher standard given the potential for abuse, there’s zero logic in letting traditional colleges off the hook entirely." This is based on the dubious premise that traditional colleges ever were "off the hook", which is demonstrably false. In the U.S., there are numerous federal, regional and occupational accreditation bodies. In Canada, colleges and universities are accredited by provincial governments. Other countries have similar requirements.

What the author's argument glosses over is that the existing set of regulatory bodies hasn't been nearly enough to curb corruption in the private sector. The profit motive in education (as in health, as in justice, as in government...) creates an incentive for dishonesty that doesn't exist in an environment where dishonesty doesn't provide financial rewards.

Nor is accountability itself any guarantee of appropriate behaviour. The U.S. is one of the most regulated economies in the world, yet conflicts of interest convert much of that regulation into tools to protect existing markets and to cater to specific lobbies and entrenched interests. I just referenced an article the other day showing how pizza has been classified as a vegetable in U.S. school lunches.

Education is better viewed as a profession with core ethics - akin to medicine, law, accounting, etc. - than as an industry depending on legislation and accountability to constrain fraudulent behaviour. That means that the core objective of education has to be something other than the pursuit of profit; otherwise the only ethic is (as it is in the financial services industry) the bottom line.

6. Outcomes should be about “distance traveled”

This is the author's "pizza is a vegetable" moment. "When we measure outcomes, we need to ensure we’re not focusing on metrics that correlate entirely with inputs, but rather on 'value added' by the institution to students."

On the surface the intuition is sound: "providing extra points to institutions with a demonstrated track record of enrolling low SES students and producing strong education and employment results."

On the one hand, this simply replaces one form of institutional cheating with another one. Instead of denying admission to low SES students, the focus turns to 'force marching' them along predefined paths (think: 'special education for poor people'). And because the only measure is 'distance traveled', it remains acceptable to produce graduates who are unqualified in terms of competencies and skills, and in addition bereft of self-management or self-sufficiency skills.

On the other hand, the representation of education as a 'path' is itself fundamentally misguided. I've talked about the weakness of the path metaphor in the past. And I've talked about the key requirement that educators prepare people not to be followers, but to be explorers.

7. Technology is key to improving learning

The gist of the author's argument here is that technology can make the delivery of instruction more efficient. There is a nod to the idea of better outcomes, but the emphasis is on more productive delivery of existing outcomes (and on reducing or limiting faculty salaries).

We see this in the reference to the Baumol effect, "a rise of salaries in jobs that have experienced no increase of labor productivity," which is part of the jargon of the productivity movement. As Wikipedia (correctly) explains it, "Baumol's cost disease is often used to describe consequences of the lack of growth in productivity in the quaternary sector of the economy and public services, such as public hospitals and state colleges. Since many public administration activities are heavily labor-intensive, there is little growth in productivity over time because productivity gains come essentially from a better capital technology."

So the point that is 'agreed upon' here is that, in education, human labour can (finally!) be replaced with technology to improve productivity and achieve outcomes more efficiently (where, as we've seen above, outcomes will be measured in 'distance traveled' toward 'competencies' which result in 'employment outcomes').

This may be how education is viewed from the outside, from a corporate, financial and perhaps political perspective, but few people actually employed in education view it this way. Oh sure, we'd like to see our graduates get jobs and succeed economically. But we like to see this as the result of the student's efforts, not as something we merely provided for them. We see it as the capability, growth and self-sufficiency we've provided, rather than as the terminus of our own efforts in the field.

So technology plays a very different role in education than it plays for people talking about education. Technology increases the capacities of educators and helps them focus on the hands-on tasks (which they've never really had time for before) while automating the things that can be automated. Technology should address many of the needless expenses associated with education - like content and content delivery, records management, unwanted and unneeded courses, etc. - and focus on the real and present needs of students.

The objective here isn't 'efficiency', though it's easy to see why outsiders cast it in this light. It's precision - being able to target our work where it will do the most good for the greatest number of people. Precision isn't simply a matter of hitting a target more often than not (that's efficiency). It's hitting the right target, at the right time, in the right way.

8. Assessments are needed to save the liberal arts

The author's argument here is that students (especially poorer students) have been increasingly turning to "pre-professional degrees" like business, healthcare, education, and technology while turning away from the liberal arts, and that unless schools can actually document the outcomes of liberal arts programs, they "will be increasingly a plaything for rich kids (who’ll use connections to get good first jobs, so it doesn’t matter what they study)."

My own education qualified as liberal arts. I majored in philosophy but took strong concentrations in the sciences, history and geography, and religious studies. As I've often said, there was a sign on the wall in the University of Calgary Philosophy Department warning students not to expect employment as a consequence of a philosophy degree. Despite taking out the maximum in student loans (totaling $25K in 1980s dollars) I didn't care.

Why not? The 1980s were recession years in Canada, and having spent time in industry before my university education I could see first-hand the fallacy of believing that a specific university program would get me a job. It didn't really matter which degree I had; they were looking only for the persistence and tenacity (and wealth and upbringing) that having any four-year degree demonstrated.

And also, I lived in Canada, and we don't starve in the streets just because we're unemployed. I knew that, and I knew that no matter what happened (as I often said at the time) "they can't take my education away from me". Not that they didn't try - the universities withheld transcripts and collection agencies destroyed my credit. But they couldn't take the knowledge back out of my head - all they did was create a healthy scepticism and distrust of institutions.

Societies that truly want to 'save the liberal arts' will derisk the pursuit of them. It's not a question of documenting outcomes - the benefits of studying grammar, logic, communications, mathematics, the arts and astronomy are actually pretty self-evident. No really successful person has succeeded without them (even Steve Jobs talks about how important the study of calligraphy was to him). When students take pre-professional degrees, they are saying, in effect, "maybe later, it's too risky now".

9. Follow the money

The author writes, "colleges and universities get paid no matter what." As with some of their previous premises, this is demonstrably false. Colleges close all the time - in the U.S. the 10-year average is five a year. Look at the struggles faced by the University of Phoenix over the last year or so. Look at the decades of declining state funding for institutions in the U.S. The same story is told of institutions in other countries. It is simply false that "colleges and universities get paid no matter what".

The author uses this premise to argue that 'we are agreed' that "the federal government has two choices: it can condition funding on outcomes (à la Gainful Employment) or require schools to put 'skin in the game'" in the form of "risk capital" for each and every student. Forcing institutions to bet on students' future financial prospects would certainly change institutional behaviour. But not for the better. It would convert 'education' into 'venture capitalism'.

I won't get into the problems with this approach in detail. It suffices for the purposes of this post to point out that there is scarcely unanimity behind the proposition that education is fundamentally an economic activity that should be financed the way we finance business and industry. But this sort of perspective should not be surprising coming from 'university ventures'. After all, there's money to be made in 'student IPOs'.

10. Colleges are worth saving (especially the one you attended!)

The author's point is exactly the opposite of the bullet point: "we don’t have enough resources to save every college (or, for that matter, to discharge every student loan)." The point is essentially that not every college can be saved and not every student can be funded. We should "avoid the myopia" that sees our own college as something that "represents the apex of civilization."

It's true. Colleges rise, colleges fall. Civilizations rise, civilizations fall. Even Plato's Academy shut down after a successful 300 year run (or 800 year run, depending on who you talk to).

But there's a difference between observing that colleges and civilizations fail and arguing that we should just stop supporting them. What we should be doing is preserving the good that these institutions provide society rather than giving up on the enterprise wholesale. A company can be happy to sell its legacy to the highest bidder. A society should not. Yes, there are "natural limits" to almost everything, but this does not constitute an argument for being the agent that applies them.

It's not a question of whether or not colleges and universities are "worth saving". To view the question in such terms is to treat them merely as economic entities and to assess them against their financial value. But they are just vessels.

What we have, in societies around the world, is a millennia-old legacy of educational institutions as stewards and purveyors of our collective wisdom not as an engine of employment or economic development, but as the reason employment and economic development exist.

In a sense, the role played by the educational system in society is the same as the role played by an education in an individual. It may result in income and employment, but that is not the purpose behind it. It is to help us not only adapt to the winds of chance and fortune but to rise above them, to create our own place in the world as free and fully realized beings, to flourish in every sense of the word.

That's not something a venture capitalist will invest in. But it's something each one of us lives for, each and every day.

