Is GPT-5 Important for Education?

You might have noticed in the media that OpenAI released a new version of their Generative AI chatbot, ChatGPT, last week. GPT-5 has been two years in the planning and was heavily hyped. The reception was at best muted, probably because of all the hype, especially from CEO Sam Altman. In recent months he has repeatedly claimed that we were on the verge of General AI, AI that could do almost anything better than humans. And prior to the release of GPT-5, which powers the OpenAI chatbot, he claimed it would be like having someone with a doctorate in every subject. Clearly it is not. It is at best an iterative improvement on GPT-4 but still suffers from most of the problems faced by Generative AI, including the propensity to ‘hallucinate’. The backlash from OpenAI fans was interesting too. GPT-5 is supposed to switch automatically between different OpenAI models, but many people were unhappy. It seems one of the reasons behind a successful campaign to restore manual access to GPT-4o was that the new model was less sycophantic. A significant number of people liked the flattery they received from the GPT-4o chatbot, many calling it their friend.
What are the implications for education and training? Probably little, although the new EDU model is yet to be released. ChatGPT has a very large appeal to students because it is free, but the functions for learning have not changed.
It might be said that the leading Generative AI programmes, from Claude, Google, X, OpenAI and the like, have not been designed with any pedagogic considerations involved. Indeed, the reduced agency for students and for teachers weighs against their value for teaching and learning.
Having said that, despite the major selling line for Gen AI designed for teachers being a reduced workload (although even this is contested), research and commentary about AI in Education is increasingly focused on pedagogy and epistemology. George Siemens says “education is a natural ground for AI since our concern is the development of knowledge. Google, Anthropic and OpenAI have all announced applications that are dialogic or Socratic in nature. If open course ware scaled content, MOOCs scaled instruction, then it looks like AI is going to scale engagement and tutoring.” He also draws attention to Ronald Barnett having stated that the growing complexity of knowledge means that we shift from epistemology to ontology in framing the intent and practice of education. He continues that “A flipped classroom approach to learning with AI then requires that we do general epistemological work in collaboration with AI as an active tutor and then devote our in-person time to more ontological or beingness attributes of education, since these are harder to scale with AI.”
I think this is all still somewhat speculative. And although I like the increasing scrutiny over the use of AI for sharing knowledge, there is still little attention to what AI could contribute to learning through practice. But the limited advances in the latest releases of Large Language Models and chatbots like ChatGPT probably confirm what Gary Marcus has long claimed: that ever larger LLMs will bring only limited improvements. One implication of this is that, to justify their huge investments, the big tech companies will turn to selling AI for replacing workers, rather than developing AGI. And attempting to increase productivity through Gen AI will impact on the tasks and competences required in different occupations.
About the Image
While humans employ AI, we still exist within and depend on the natural world. Nature provides the raw materials (minerals, water, energy) that sustain both human lives and the functioning of technologies. The image is slightly distorted by digital artefacts to visually represent the tensions, dependencies, and conflicts between nature and AI development.