In February, Reuters reported that ChatGPT registered an estimated 100 million monthly active users in January, making it one of the fastest-growing consumer internet applications in history.
The figures meant the platform outpaced the initial user growth of popular social media platforms such as Instagram and TikTok within months of its launch.
According to ChatGPT’s website, the platform is trained using the machine learning technique of reinforcement learning from human feedback to follow an instruction in a user’s prompt, provide a detailed response, and collect data from those interactions to improve its responses over time.
But while ChatGPT has become the most recent face of artificial intelligence (AI) tools, it is neither the first nor the only one.
During a UWI Vice-Chancellor’s Forum in May titled “Artificial Intelligence – A Blessing or Curse for Higher Education”, education experts said AI tools have become more popular and easier to access as technology advances.
Acknowledging that AI is here to stay, they called on educators and educational institutions to focus on preparing students to comprehensively understand and ethically use it.
Patti West-Smith, Senior Director of Customer Engagement at US-based anti-plagiarism tech firm Turnitin, said educators now “have an obligation to equip students to really understand what these tools can and cannot do, and what purposes they serve”.
She explained, “Over the last couple of months, we’ve learned that the world of work ahead of them (students) will almost certainly require them to use these tools. So, if we are not arming them with the right understanding, and experiences to be able to use these tools effectively and ethically, then we are really short-changing them.”
Apart from ensuring students understand AI, West-Smith said unequal access to these tools can also widen the digital divide and deepen disparities in learning. She called on educational stakeholders to ensure all students have access.
While West-Smith believes AI offers opportunities to rethink writing and assessment processes, Arianna Valentini cautioned educators to ensure their students are aware that AI tools like ChatGPT are inherently biased, as are all technologies that rely on collecting human data.
A research and foresight analyst at the UNESCO International Institute for Higher Education in Latin America and the Caribbean, Valentini said AI tools cannot replace the value of human discourse, the role of writing skills in educational development, or critical thinking.
She added, “[AI] depends on the information and how it has been programmed. So, the biases that AI can have include gender biases, and they can be geographically biased to the Global South and North.”
Dr Margaret Niles, manager of research insights and product innovation at the Caribbean Examinations Council (CXC), said the education system must determine how it will exist alongside AI tools by re-evaluating and recalibrating teaching and learning transactions to accommodate rather than fear them.
She said, “At the council, we accept that ChatGPT is here and it will come in different forms. We are not fearing it.”
Dr Niles added, “We do understand that we have to focus a lot on authentic assessment, and that our stakeholders will now need to be very vigilant as we develop the strategies and policies to support the integration of ChatGPT across our syllabi and curriculum.”
However, she pointed out that educators must now put guardrails in place to preserve soft skills like critical thinking and creativity while ensuring students fully understand the true meaning of academic integrity.
Valentini echoed Niles’ sentiments on re-evaluating learning styles, saying AI can be a springboard into discussing the development of competency-based curriculums and assessments as opposed to content-based ones.
She said, “Everyone has mentioned we are going to be needing critical thinking, entrepreneurship skills, and all these transversal skills that might be easier to reach through a competency-based curriculum.
“This can also help us design new assessments that are not going to be easily replicable with artificial intelligence.”