Two years ago, the accountancy firm EY made an announcement that no doubt sent a shiver down many lecturers’ spines. Having failed to find any published evidence that graduates with good degree results make better employees, the company trawled through its own data and, it revealed, similarly found “no evidence to conclude that previous success in higher education correlated with future success in subsequent professional qualifications undertaken”.
Instead, what predicts success at EY, as measured by recruits’ performance in annual appraisals and accountancy exams, is a “mix between behavioural and cognitive attributes”, explains Dominic Franiel, head of student recruitment at the company. These attributes include many things that higher education is supposed to instil: logical thinking, the ability to understand the root cause of a problem, rapid comprehension of new concepts, self-motivation, a confidence-inspiring and professional manner, and a strong work ethic.
Acting on these findings, the company, which recruits about 900 graduates a year in the UK alone, scrapped its requirement for applicants to its graduate scheme to have an upper second-class degree and 300 Ucas points (equivalent to three B grades at A level). They still need a degree, and the company “still values academic achievement”, says Franiel. But rather than relying on university grades, EY now assesses applicants using its own bespoke tests. And this has had a noticeable effect on the kinds of people who succeed, with a rise in the proportion of graduates who went to state schools and who were the first in their family to go to university, says Franiel. Nearly one in five new recruits would have been excluded under the old system for having low grades, he adds.
EY’s move is part of a growing trend. Another UK-headquartered accountancy firm also scrapped its 2:1 requirement in 2013. In the same year, Google’s internal research discovered that college grade-point averages and other test scores were “worthless” in predicting future success at the company, and it scrapped its requirement for applicants to submit detailed results for everyone except brand-new graduates. And last year, the publisher Penguin Random House dispensed with the need for a degree altogether, saying that there was no link between having one and workplace performance. According to Stephen Isherwood, chief executive of the UK’s Association of Graduate Recruiters, “there’s a growing recognition that using academic grades as a kind of binary cut-off, as a cliff edge, is no longer [effective]”.
Few would argue that universities should simply be factories for the production of trainee accountants able to fit seamlessly into the corporate world. Higher education is clearly about much more than that. But there is increasing scrutiny of graduate outcomes by both governments and students (the UK’s teaching excellence framework and Longitudinal Education Outcomes project are good examples). If higher education is not seen to be adding any value to graduates’ skill sets, that has clear implications for university enrolment and funding levels.
So what value is higher education supposed to add? And how is this different from what school or vocational education offers? When challenged on this, the stock response from university leaders is “critical thinking”. Although the term is rarely defined in great detail, it is understood to involve an ability to think independently and to question assumptions in a structured, logical way. And, quite apart from the usefulness of such an ability in the professional sphere, it is also seen by many as a crucial element of informed citizenship. Insufficient levels of it among the electorate have been blamed by some for the success of last year’s campaigns for the UK to leave the European Union and for the US to elect Donald Trump as president – which, opponents say, made claim after claim that bore scant relation to logic or truth.
But there are two related questions around this. One is whether firms themselves really value the kind of critical thinking that academics prize. The bespoke tests that they are increasingly using to sift job applicants suggest that the kind of contemplative, expansive anatomisation of arguments that leads to high essay grades may not be what is required by successful operators in the modern, time-pressed workplace.
A further question is whether even the academic brand of critical thinking is being particularly well taught at university. According to Bryan Greetham, a philosopher and university researcher who has written several books on how students and professionals can improve their thinking, “We tend to want to do the simple thing – which is to teach students what to think, not how to think.” And it has long been an open secret in higher education that the sharpening effect, however defined, of a university education on students’ minds is far from well evidenced.
This was most famously explored in the 2011 book Academically Adrift: Limited Learning on College Campuses. The authors, American sociologists Richard Arum and Josipa Roksa, found that 45 per cent of US undergraduates failed to significantly improve their critical thinking, complex reasoning and writing skills during their first two years at university. Other US-based studies have raised similar concerns. One, published in 2009 in the Journal of Experimental Education, warned that college and high school students have “difficulty evaluating arguments on the basis of their quality”.
But definitive studies of the issue are still lacking. There is, for instance, no university equivalent of the Organisation for Economic Cooperation and Development’s Pisa tests for schools, which compare schoolchildren’s reading and numeracy skills across countries. When the OECD recently trialled a similar assessment of graduates, the initiative was effectively scuppered after opposition in the UK, the US and Canada.
So, with the credibility of their credentials questioned, what can universities do to ensure that their students graduate with better cognitive skills? How can they live up to their claim that critical thinking is their unique selling point?
Some have begun explicitly teaching analytical skills. For instance, Anne Britt, a psychologist who has conducted studies into whether students can comprehend, evaluate and write arguments, runs a course at Northern Illinois University called simply “thinking”. After just half an hour of instruction, she says, most students do show significant improvement – although there is also a significant minority who make little progress even after a term.
To get the hang of certain ideas, she says, requires plenty of contact time between student and teacher. “I have many students who come in and think they’re excellent at argumentation. In fact, they’re not. Early on, they need feedback because they conflate argumentation with giving an opinion,” she says.
But if universities don’t have the resources to offer intensive classes, could they weave the teaching of critical thinking skills into regular teaching? Britt thinks that academics can easily make time for quick “check-ins” during their lectures to ensure that their students understand what they’ve been told. “It doesn’t have to take long. It could take 10 minutes at most…otherwise students are not actually thinking during the lecture,” she says. A quick intervention can make students realise: “Oh! This is how I’m supposed to be reading my book. These are the kinds of questions I’m supposed to be asking.”
High school experience, of course, varies enormously by country. In France, studying philosophy – arguably the closest that traditional disciplines get to explicit critical thinking courses – is compulsory in the final year of secondary school. In England, meanwhile, the critical thinking A level has been scrapped. And Robert de Vries, a lecturer in quantitative sociology at the University of Kent, became “convinced pretty quickly” that many UK students need “explicit, remedial instruction in these abstract skills. I get the sense that students are used to being marked for content – ‘Have you covered this topic? Have you mentioned this fact from the textbook?’ – rather than for the quality of their reasoning or argumentation.”
However, according to de Vries, university courses “often can’t devote the time needed to explicitly teach the abstract tools of critical thinking: how to construct a good argument, how to spot weak evidence for a claim. They have a lot of substantive content to cover, and, to an extent, they have to assume that students will have already picked up a lot of this stuff by the time they get to university.”
This state of affairs, he says, explains why he too teaches a specific critical thinking course, which is compulsory for Kent’s sociology, social policy and social research students, and can be taken as an optional module by other students.
Another problem with assuming that students can pick up critical thinking skills during their normal studies, Britt adds, is that each subject leaves them with very different ideas about how to argue. In contrast to the scientific approach, “in history, I might never have a graph of data”, she says. Hence, when graduates start work, a historian and a scientist may begin with very different concepts of what constitutes reliable evidence, she says. So the idea that all university graduates have a generic ability to think critically may be somewhat misleading.
Moreover, widespread doubts exist that critical thinking is the be-all and end-all of employability. As previously mentioned, employers additionally value certain attitudinal traits, which their aptitude tests also seek to assess. And, according to Greetham, critical thinking alone cannot give graduates what their employers prize above all: the ability to come up with new ideas and concepts, and to create solutions to problems.
For him, critical thinking is useful, but it “works on the assumption that facts, right answers and certainties are out there just waiting to be discovered” by logical thinkers. “It assumes that a teacher’s role is to find these facts and transmit them, while teaching students the skills to chip away all those things that obscure them, [such as] the inconsistencies in our reasoning, irrelevant arguments and unsupported assumptions.”
But this, Greetham says, is a flawed picture of reality. Rather than critical thinking, he says, students need to be taught what he calls “smart thinking”. This requires lecturers to allow far more discussion in class, and to guide students in how to analyse and synthesise concepts. “You need to lose control in the classroom in a way – because you’ve got to say: ‘Let’s have your ideas,’” he argues. Instead of requiring students to present their ideas to the class, followed by a discussion, lecturers need to get students to analyse the meaning of concepts, “recording their ideas on a whiteboard as they shape and reshape them”, and guide them in “reinterpreting and adapting their solutions”. This kind of teaching, he maintains, genuinely develops creative and conceptual skills.
But, Greetham warns, such change is unlikely to be possible while academics continue to be recruited on their research record, as opposed to their teaching ability.
As well as calls for critical thinkers and smart thinkers, there are also calls from politicians for more “entrepreneurial” university graduates – who, instead of joining graduate recruitment programmes at large employers, might start their own businesses. Exactly what this means is hard to pin down, but it generally involves an emphasis on less theory and more practical experience.
Indeed, there is a general move across the world to align higher education more closely with vocational training. Nowhere is this trend more apparent than in Germany. “Years ago, the two education sectors were following different, opposite, logics – scientific versus practical orientation – and were addressing strictly separated job markets,” says Ulrich Mueller, head of policy studies at the Centre for Higher Education, a German thinktank. “The situation now is different: there is a big overlap between academic and practical education.” There are now very job-focused academic courses, he says, as well as vocational programmes with lots of academic content. The overlap is such that graduates of these two streams now compete for the same jobs in some areas, such as medicine and computer science. “I am sure that in 15 or 20 years there will be an [integrated] system of post-secondary education,” Mueller concludes.
In such a world, of course, there would be no need for higher education to define itself in a particular way, in contrast to other forms of education. But in the meantime, pinning down the unique skills of university graduates remains difficult – especially when accessing higher education can be so much more expensive than vocational alternatives.
It is possible to overstate the level of concern about the issue. The US National Association of Colleges and Employers, for instance, says that it hasn’t seen a shift away from degrees towards employer-designed tests, suggesting that US firms largely still see college GPA as a reliable indicator of the kinds of skills they are looking for. And, in the US, getting a college degree is arguably more important than ever in terms of being competitive in the job market.
Yet, paradoxically, the bigger the graduate cohort becomes, the more employers are likely to question exactly what, if anything, having a degree really indicates. And that, in turn, could require universities to ask hard questions about how much – or how little – their courses really change how their students think.
Think fast: what do graduate recruitment tests actually involve?
Having taken several of the bespoke tests that firms are now using to put their graduate applicants through their paces, it’s probably fair to say that I shouldn’t give up the day job.
But I suspect that I am not the only degree-holder who would struggle: the tests assess skills rather different from those required to pass university exams. Some questions involve numbers, shapes or text, but all are broadly designed to gauge mental agility. In abstract reasoning tests, for example, a typical question features a sequence of shapes, which you are asked to continue. Circle, square, circle, square: what next? That’s easy – but the patterns quickly get a lot more complex than that.
Another test, a spatial reasoning exercise designed for recruiters by specialist firm Saville Assessments, involves rotating 3D shapes in your head to see which is the odd one out. Meanwhile, a verbal reasoning section requires you to speed-read a passage about eating habits – putting any emotional or critical reaction out of your mind – and then click on the best summary of it, or choose the best synonym to replace a word. Worryingly for a journalist, I completely flunk error-checking, which involves scanning a spreadsheet at a furious pace.
It is the rapid-fire nature of the tests that most distinguishes them from university work. You rarely have more than 30 seconds to answer a question; in some cases, you have little more than 10 seconds. It’s a different form of mental activity from, say, crafting a dissertation over many months.
But perhaps the skills that these tests assess – can you quickly skim a paragraph or a screen of numbers, and then fire off an acceptable answer? – are more useful in the modern, time-pressed office than the sustained thought that higher education is supposed to inculcate.
That said, it is important to note that the tests aren’t used in isolation. Situational judgement tests, where applicants are presented with a real-world dilemma, are also increasingly in vogue. According to Dominic Franiel, EY’s head of student recruitment, a candidate might be asked, for instance, how they would handle a situation in which a more senior colleague was suddenly called away and they were left having to meet clients. Given a list of five choices, “there’s probably an ideal answer”, but it’s “quite nuanced”.
Moreover, applicants who get past the first online round do then have to weigh evidence and formulate their own opinions at a more leisurely pace at assessment centres, he adds.
It is also worth noting that the testers make no pretence of assessing raw ability. They readily admit that scores can be enhanced by effective prepping – which I did not do. So perhaps there is still hope for more ponderous graduates like me after all.
David Matthews
Print headline: Fuzzy logic