AI will replace academics unless our teaching challenges students again

Delivery of educational material chunked at the optimal grade for retention by passive student-consumers is ripe for automation, says Andy Farnell

January 19, 2023

These past months have seen a wave of articles about a new kind of AI called large language models (LLMs), of which ChatGPT is the most prominent.

The liberal progressives have embraced it, declaring that its ability to auto-generate plausible-sounding text in seconds strengthens the case for “authenticity” in assessment. By contrast, the conservatives are doubling down on detection, proctoring and reference-checking. But both camps are missing the bigger question: if you cannot tell a machine from a genuine student, what makes you think a student cares whether they’re taught by you or a machine?

Critics are quick to seize on ChatGPT’s limitations. By averaging a colossal training set of roughly 1TB of text across 175 billion word-association parameters, it responds to questions in the style of an overconfident sixth-form essay, with a strident, repetitive tone. But truth is not a reliable outcome of this process, as many examples have highlighted. LLMs lack even the most basic epistemic position and any understanding of causation or structure.

Yet that doesn’t matter, advocates argue. This is just a start. LLMs, notable because they are superficially human, have yet to integrate with other kinds of AI in this rapidly maturing field. Once they do, they will be capable of effective reasoning – or its useful simulation.

The likely impact of LLMs on labour markets is certainly underestimated. Application researchers are already eyeing them for tasks such as retail, customer assistance, query and decision support. And their impressive ability to interactively deliver short, informative responses, more succinct and focused than Google or Wikipedia, makes them strong candidates for teaching, too. Using current technology, adding speech systems for accurate listening and expressive voice synthesis is almost trivial.

About £5 an hour currently buys computing power to service five to 10 students simultaneously. Surely administrators will be falling over themselves to replace teachers with bots trained in specific knowledge areas – perhaps incorporating Douglas Adams-style “personality” add-ons, allowing students to be taught by simulated Einsteins or Feynmans.

But as the cost falls, and as people become increasingly unable to distinguish plausible-sounding nonsense from genuine wisdom, human suppliers will compete with machines in a race to the bottom. And the disastrous consequences will not be limited to academics’ bank balances. As a systems theorist, I predict that this market for lemons, as economists call it, will run into the same problem that devastated banana production in the late 20th century: production of sterile monocultures. Positive feedback loops of mediocrity will kill off intellectual progress by failing to reproduce innovative experts with core disciplinary skills. At best, we will be stuck in an endless recycling of “approved facts”. At worst, our ability to reason and assess knowledge claims will collapse, leaving us sitting ducks for recruitment and brainwashing by malign forces.

In fact, you could argue that we are nearly there already. Credential inflation means that a degree is now considered a necessity, and many students are not so much thirsty for knowledge as anxious about being left behind in the red queen’s race to grow their CVs. Profit-hungry universities’ response to this market has, at the extreme, reduced professors to poorly paid operators of degree machines that chunk educational material at the optimal grade for retention by passive student-consumers. Such a model is ripe for automation and sublimation by LLMs, whose training data can be washed of anything too marginal.

To save both our jobs and society, academics must go back to the future. At our best, we did not spoon-feed students as we do today. Nor was that what they wanted. They came for guidance, encouragement and socialisation into collegiate life, already as full adults. We always had an implicit duty of care, but we also had the authority to direct and judge.

In my tutorials in the 1990s, anyone who sat in meek, expectant silence was ignored – by me as much as by their peers. Something important I took from a psychoanalyst is that “this doesn’t begin until you have the courage to speak first”. A good professor is a sparring partner who aims to toughen up young minds through merciless examination of purported knowledge, a process continuous with research. And I believe many students still crave to be challenged – even if they have lost the words to express it.

There is unquestionably a demand for passive inculcation, too. But in a world bereft of benevolent intellects, trusted, respected human knowledge curators will command a premium like never before – possibly in new freelance roles akin to those of consultants or therapists.

We will need courage to retake the ground freed by the machines. Begging bowls in hand for grants and accolades, we have allowed ourselves to be bullied by MBAs with twice our salaries and half our IQs. But once we accept that AI will completely replace our current jobs as mouthpieces for statistically “consensual truth”, we can begin valuing ourselves again.

By re-embracing our former status as authorities and role models, we will become real professors again.

Andy Farnell is a visiting and associate professor in signals, systems and cybersecurity at a range of European universities.
