The market for lemons in higher education: What fake bibliographies reveal about AI and credibility
By Richard Sebaggala (PhD)
In economics, one of the most enduring insights is that markets collapse when information asymmetries exist. George Akerlof’s “Market for Lemons” showed how buyers who cannot distinguish good used cars from bad ones come to distrust the market as a whole. The seller’s credibility becomes crucial. Once trust has eroded, assurances alone cannot restore it; the seller must show in word and deed that they know more than the buyer and use that knowledge responsibly. Education is also a market, even if it is not always seen in this light. Professors sell specialised knowledge, and students are the consumers. The same problem of information asymmetry now arises with the use of artificial intelligence in teaching.
A recent case at the National University of Singapore illustrates the problem. A professor assigned his literature students a reading list of six scholarly works. When the students tried to locate the references, they realised that none of them existed: the list had been compiled from a machine-generated bibliography. Confronted with this, the professor admitted that he “suspected” some of the material came from an AI tool. At first glance, the incident seemed insignificant; no grades were affected, and the exercise was labelled “optional”. From an economic perspective, however, the consequences were serious. The relationship of trust between professor and students was weakened. Students realised that even those who set the rules for the use of AI did not always know how to use the technology responsibly.
The irony is clear. Professors often warn students against outsourcing their learning to AI, citing the danger of hallucinations, fake citations and shallow thinking. Yet the professor who circulated a reading list of non-existent works made exactly that mistake. When the gatekeeper cannot distinguish fact from fiction in his own assignments, students rightly question his authority to penalise them for similar lapses. The situation resembles a car dealer who asks buyers to trust his inspections but fails to spot defective vehicles. In the long term, such failures undermine the credibility of the entire market, in this case higher education itself.
Economists also speak of signalling. People and institutions send signals to establish credibility. A degree signals competence; a guarantee signals confidence in a product. Professors signal expertise through carefully designed syllabi, well-constructed reading lists and rigorous assessments. When students discover that a reading list is nothing more than unchecked AI output, the signal is reversed: what should have conveyed care and competence instead conveys carelessness and over-reliance on poorly understood tools. The doubt spreads. When one professor makes such a mistake, students wonder how many others rely on AI without understanding it. And if the experts appear confused, why should the rules they set be seen as legitimate?
The economics of education depends on credibility. Students cannot directly test the quality of teaching the way they can test the durability of a chair or the performance of a phone. They have to trust their teachers. The value of their tuition, time and intellectual effort rests on the assumption that professors know what they are doing. That assumption is a fragile contract, and careless use of AI puts it under pressure. The information asymmetry no longer runs only between professors and students, but also between people and the technology both groups are trying to master. Professors who cannot demonstrate their expertise see their advantage dwindle; the mentor risks becoming a middleman, displaced by tools he or she does not know how to use.
This is why the debate about AI at universities cannot be reduced to prohibiting or policing its use by students. The future will demand AI skills, and universities should recognise this. Professors have a responsibility not only to set rules but also to model responsible use: checking sources, cross-checking results, disclosing when AI has been used, and explaining its strengths and limitations. Just as central banks maintain market confidence by consistently demonstrating expertise, professors sustain the learning market by showing that they can use these tools with care and transparency.
The episode at NUS is more than a minor embarrassment. It shows that the teaching profession risks losing credibility when those who are supposed to guide students appear unsure, careless or inconsistent in their use of technology. Students notice the double standard. They see their own use of AI strictly regulated while professors experiment without consequence. They hear over and over that critical thinking matters, yet are given assignments based on unverified material. They are told that integrity is essential, yet they watch the lines blur. Economics tells us what happens next: trust declines, and the value of exchange between teachers and learners diminishes.
To avoid this outcome, universities need to champion AI literacy rather than bans. Professors should lead by example, signalling through their practice that they can guide students responsibly. This is not merely a technical issue but one of institutional credibility. Without it, the education market risks the same erosion of trust as Akerlof’s used-car market. Students may begin to ask why they should trust their teachers at all when the signals are inconsistent and the asymmetry so obvious. When that happens, the value of higher education itself is diminished in a way far more damaging than a single flawed reading list.
To think like an economist, one must shed illusions about authority and examine the incentives and signals at work. Professors cannot warn their students about AI while misusing it themselves. They need to understand that credibility is a currency in the marketplace of learning. Once squandered, it is very difficult to regain.