Eliminating harmful digital technologies in universities: a guide
Modern institutions are rife with tech that disenfranchises, dehumanises, excludes and even bullies students and teachers. It's high time for a rethink, says Andy Farnell
I was recently asked: "Which digital technologies could we get rid of in higher education?"
Some immediately spring to mind, such as the scourge of CCTV cameras and badge-access systems, which are turning places of learning into high-security gulags, or the draconian monitoring of student attendance, at the behest of government bureaucrats, as if students were preschool infants. But these technologies, unwelcome and unnecessary as they are, do not capture the problem, which is one of equity.
Every part of an equitable university is accountable and responsive to its core stakeholders: students and teachers, without whom the entire institution makes no sense. Each must be able to teach and learn as a fully human participant: to be genuinely heard, to be held in mind, and to have choice, agency, autonomy and equality of opportunity.
Since every aspect of teaching and learning is touched by technology, naming specific problem technologies for elimination is akin to asking which limb we ought to amputate to cure a patient with a virus. So we must reframe the question. Technology can deliver cheap, fast, efficient, uniform, accountable and secure education, but systemisation carries a catastrophic cost that falls upon students and teachers. So let us ask: what types of harm are linked to technologies, so that we might design or select better alternatives? And how do we eliminate those products and services that cannot, or will not, perform desirable functions without attendant burdens?
Harm occurs when technologies divert equity away from key stakeholders toward powerful but marginal stakeholders, namely chancellors, trustees, directors, dignitaries, landlords, governments, industries, advertisers, sponsors, technology corporations, suppliers and publishers. Harms arise because these entities have become invested in pushing technologies that favour their products and interests into the education ecosystem.
Obviously, we can't entertain the idea of removing all technologies from education, if only to dodge the pedant's retort that we'd better burn all books and blackboards while we're at it. Rather than looking for technical errors, let's recognise that technologies embody the values and interests of their makers, and it is these that lead to misuse.
As a brief summary, we wish to identify and eliminate systems that:
- disenfranchise and disempower
- dehumanise
- discriminate and exclude
- extract or seek rent
- coerce and bully
- mislead or manipulate
Disempowering technologies
People unable to "keep up" with technology are disempowered. Those seeking to disempower need only follow Mark Zuckerberg's call to "move fast and break things". For example, Microsoft's Windows 11 rollout threatened to obsolete millions of perfectly serviceable PCs, until a huge backlash erupted. Touted as "security" improvements, the updates, for example new hardware requirements such as TPM 2.0, just handed more control of the owner's PC to Microsoft. By contrast, the latest releases of Linux happily run on much older computers without entitled seizure of the owner's operational sovereignty.
Similarly, vendors suddenly introduce incompatibilities into newer software. Google famously discontinues services on which millions depend. Take a solemn stroll through the Google graveyard and see if any headstones evoke a tear. University IT centres expose students to risk by choosing software from companies with poor track records for long-term stability, equal access and interoperability. Students suddenly find their education is "not supported".
Systems that dehumanise
To dehumanise is to ignore or minimise individual identity, erode empathy and enforce uniformity. Since the 1990s, students have been "bums on seats". Digital technology simultaneously connects people and puts distance between them, removing the proximity and rich reality of interpersonal communication that demand a higher level of respect. Dr Andrew Kimbrell terms this deflation of responsibility "cold evil". As examples, "issue ticketing" used in customer-support systems and "no-reply" emails (those infuriating emails that will not allow you to reply) both silence voices and stunt discourse; they are typical dehumanising devices.
Increasing use is made of unaccountable algorithms to automatically shut people out of systems when their "behaviour is deemed suspicious". People who deploy algorithms should be held personally responsible for the harms caused, as if they had acted themselves, rather like dog owners. In reality, as Cathy O'Neil points out in her book Weapons of Math Destruction, blaming the victims of toxic IT systems for falling foul of invisible "policies" is the norm.
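To make the pattern concrete, here is a minimal sketch of the kind of opaque lockout rule described above. Everything in it, the signal names, the weights and the threshold, is invented for illustration; no real system is being quoted.

```python
# Hypothetical sketch of an unaccountable lockout algorithm.
# Signals, weights and threshold are invented for illustration.

def suspicion_score(logins_per_hour: int, new_device: bool, vpn: bool) -> float:
    """An opaque score: in deployed systems, users never see this formula."""
    return (0.4 * min(logins_per_hour, 10) / 10
            + 0.3 * new_device
            + 0.3 * vpn)

def gate(username: str, **signals) -> str:
    if suspicion_score(**signals) > 0.5:
        # The user is told nothing actionable, and there is no appeal path.
        return f"{username}: account locked (behaviour deemed suspicious)"
    return f"{username}: access granted"

# A student working late from a new laptop over a VPN trips the rule.
print(gate("student42", logins_per_hour=6, new_device=True, vpn=True))
```

The point is structural: the score is hidden, the threshold is arbitrary, and the person locked out receives nothing they could contest.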
Systems of exclusion
The cashless canteen is as effective at starving students of food as overzealously locked-down wi-fi and audiovisual equipment is at preventing lecturers from teaching. Exclusion begins with assumptions that are silently transformed into mandates. As a regular visiting professor, I make sure to pack a flask of coffee and a lunchbox alongside my 4G wi-fi dongle and mini AV projector. Eduroam, specifically designed to sidestep such parochialism, is often disabled. Universities are hostile places unless you're part of the "in crowd", and that needs to change.
Furthermore, as big tech monopolies take over education, it is getting harder to access essential services without "signing in" via a Facebook, LinkedIn, Microsoft or Google account. Those who subscribe to none of these are locked out, without alternative provision or apology. Blunt web censorship based on common keywords alienates research students investigating inconvenient subjects such as terrorism, rape, hacking or even birth control. We must re-examine the power to shape academic life that has been accidentally handed to non-academic departments such as ICT, security and compliance teams. Surely the censoring and monitoring technologies characteristic of police states have no place in institutions of free enquiry and exploration by intelligent adults.
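To see how blunt keyword censorship misfires, consider this toy sketch. The blocklist and the queries are hypothetical, but the failure mode is exactly the one described above: substring matching knows nothing of context or intent.

```python
# Illustrative sketch of a blunt keyword blocklist; terms and queries
# are hypothetical examples, not drawn from any real filtering product.
BLOCKLIST = {"terrorism", "rape", "hacking", "birth control"}

def is_blocked(query: str) -> bool:
    """Naive substring matching: no context, no intent, no appeal."""
    q = query.lower()
    return any(term in q for term in BLOCKLIST)

# Legitimate literature searches are treated exactly like wrongdoing.
for query in ["counter-terrorism policy review",
              "ethical hacking module reading list",
              "history of birth control legislation"]:
    print(query, "->", "BLOCKED" if is_blocked(query) else "allowed")
```

Every one of these legitimate academic queries is blocked, and the researcher, not the filter, is left to explain themselves.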
Systems of extraction
Rent-seeking software, such as survey tools that hold research data hostage until the student pays a "premium fee", is encouraged in universities that lack the skills to set up their own basic HTML forms. Data harvesting is performed by tools such as Turnitin, which requires students to sign over rights to their work, and single sign-on frameworks that leak browsing habits. Tracking, attention monitoring and targeted advertising are part of campus life.
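The "basic HTML forms" point deserves emphasis: collecting survey responses on a university's own server is a small job. A minimal sketch, assuming only Python's standard library, a local responses.csv file and port 8000 (all illustrative choices, not anything prescribed here), might look like this:

```python
# Minimal self-hosted survey form: serves a page, stores answers locally.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs
import csv, datetime

FORM = b"""<!doctype html>
<form method="post" action="/">
  <label>How useful was the course? <input name="rating"></label>
  <button>Submit</button>
</form>"""

class SurveyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(FORM)

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode())
        # Append each response, timestamped, to a local CSV file.
        with open("responses.csv", "a", newline="") as f:
            csv.writer(f).writerow([datetime.datetime.now().isoformat(),
                                    fields.get("rating", [""])[0]])
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"Thanks! Your response stays on our own server.")

if __name__ == "__main__":
    HTTPServer(("", 8000), SurveyHandler).serve_forever()
```

Nothing here phones home, holds data hostage or demands a premium fee; the responses never leave the institution.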
Let us now ask whether such extractive systems should be eliminated outright, although that may be too harsh a step change. Alternatives require skills and education. Instead, let's at least mandate choice, so that those who choose, and are able, to extricate themselves are free to do so. Universities that force Google or Microsoft products on their members should lose government backing for being nothing more than extensions of the US corporate estate.
Systems of coercion
Threats hardly seem appropriate for a progressive learning environment, yet for decades I have taught inconsolably anxious students mortified by attendance reports, submission systems and other machinery that sends nagging notifications, not to mention spurious or false warnings. The more we automate the student experience, the more brutal it becomes. Universities living in fear of losing their licences to sponsor international students must dial back their overcautious machinery. We must realise that the impact on the mental health of students who genuinely believe a faulty algorithm may put them on a plane to Rwanda is not an acceptable price for over-compliance.
Common aims
Many inappropriate technologies blight higher education because we do not understand technology well enough. Solutionism, a knee-jerk mentality and a penchant for cheap, quick, off-the-shelf fixes are rife. We lack a coherent, joined-up understanding of the trade-offs: psychological, political and pedagogical.
Change begins with raising the skill levels and issue awareness of strategic, policymaking and ICT staff, and with improving the digital literacy of all academic staff, so that we can shrug off our unhealthy fallback on convenient but inappropriate technologies. It is time to make the voices of the most important stakeholders, students and faculty, heard again and to remedy the profound dearth of equity in technology selection and procurement.
Andy Farnell is a British computer scientist specialising in signals and systems. He is a prolific speaker, visiting professor, consultant, ethical hacker, and lifelong advocate for digital rights. He is teaching cybersecurity while writing a new book, Ethics for Hackers.