In an age marked by heavy reliance on technology, students are navigating a new academic frontier, one that involves AI-powered chatbots and essay mills. As we examine this troubling phenomenon of student cheating, we uncover the rise of deceptive ads on popular platforms like TikTok and Meta, exposing a disturbing alliance between technology and academic dishonesty.
The TikTok-Meta Connection
Recent investigations have unveiled a disconcerting trend where dubious advertisements promoting AI-driven student cheating services have infiltrated TikTok and Meta, raising ethical and legal concerns. Fast Company’s inquiry into this matter prompted swift action, with some of these ads being taken down.
The issue, however, goes beyond advertising policies, delving into a complex web of AI ethics and academic integrity.
AI Chatbots in Education
This academic year marks a significant shift as students gain easier access to AI-powered chatbots like ChatGPT. While these virtual assistants can generate text quickly, the technology is far from infallible.
Researchers have identified a prevalent issue known as "hallucinations" in chatbot responses, with some suggesting that as many as one in five citations generated by GPT-4 may be fabricated.
The Rise of Essay Mills
Capitalizing on the unreliability of AI chatbots, essay mills are on the rise. These services blend AI and human labor to create academic content that evades detection by anti-cheating software.
A recent analysis posted to the open-access repository arXiv reveals that such mills are actively soliciting clients on TikTok and Meta platforms, despite essay mills being illegal in several jurisdictions, including England and Wales, Australia, and New Zealand.
Furthermore, according to Michael Veale, an associate professor in technology law at University College London, platforms like TikTok and Meta may be violating the law by advertising these services. He discovered the extent of this issue while examining ad archives in response to the EU’s Digital Services Act, which aims to increase transparency regarding advertising on major tech platforms.
The study identified 11 different AI-powered services behind these advertisements, purchased by various companies offering a range of essay writing services.
TikTok has responded by removing the flagged videos and banning the associated accounts for breaching their advertising policies. A TikTok spokesperson emphasized in a statement to Fast Company that while ads promoting AI applications like ChatGPT are allowed in certain cases, misleading or dishonest advertisements are strictly prohibited.
In contrast, Meta did not provide a comment on the matter, and some of the reported ads remain visible, albeit inactive, on their ad transparency platform.
The Challenge of Enforcement
Veale underscores the difficulty of enforcing these laws, which are often vaguely worded, noting that tech giants must decide how broadly to apply regulations against these potentially illegal services. That ambiguity could have wide-reaching consequences, potentially affecting general-purpose AI systems and assistive tools as well.
A Growing Issue
Academic integrity specialists like Thomas Lancaster at Imperial College London acknowledge the growing role of AI in cheating: both students and contract-cheating providers are increasingly turning to AI for their needs. And as AI technology becomes more commonplace, reliably detecting its use becomes ever harder.
As AI continues its relentless march into every aspect of our lives, the battle against cheating and dishonesty in education is becoming more challenging. The proliferation of AI-powered services on platforms like TikTok and Meta underscores the need for both legal clarity and responsible platform governance.
While students grapple with the allure of AI-powered shortcuts, the question remains: Can we effectively address this issue in the evolving landscape of technology and academia?
Giancarlo is an economist and researcher by profession. Before joining Blockzeit's team, he managed several crypto projects for both government and private-sector clients as a project manager at a consultancy firm.