Learning to think in the age of AI: lessons for vocational education from the new UNESCO Courier

The latest issue of the UNESCO Courier asks a deceptively simple question: do we still need to think? At one level, the answer is obvious. Of course we do. Yet the fact that the question now has to be posed tells us something important about the moment we are in. Generative AI has arrived in education not simply as another digital tool, but as a technology that appears to challenge some of the core purposes of teaching and learning. It can generate essays, summaries, lesson materials, quiz questions, code, translations and feedback at speed. In doing so, it encourages a familiar temptation: to confuse the production of text with the development of understanding.
What makes this issue of the Courier particularly useful is that it does not approach AI in education as a narrow question of classroom efficiency or cheating. Instead, it brings together a series of articles from different parts of the world to ask what AI means for learning, language, pedagogy, assessment and the work of teachers. For those of us involved in vocational education and training in Europe, this wider view is valuable. VET has always had to balance knowledge, practical competence, social participation and preparation for working life. That is precisely why the current AI debate cannot be reduced to whether students use chatbots to complete assignments.
In the lead article, Wayne Holmes argues that education remains irreplaceable because learning is only one part of what education does. Drawing on Gert Biesta’s well-known distinction, he reminds us that education is also about socialisation and subjectification: becoming part of a community, and becoming a person capable of judgement, responsibility and independent thought. This matters greatly for VET. Much vocational learning does not happen through the passive reception of information. It happens through participation in communities of practice, through feedback, imitation, discussion, error, reflection and the gradual development of professional judgement. A student may be able to use AI to generate a report on health and safety, customer care or electrical maintenance. That does not mean they have learned to act responsibly in a workshop, to make decisions under pressure, or to work well with others.
The Courier is also right to push back against the assumption that AI output is the same thing as knowledge. Holmes notes that generative AI can produce fluent and plausible responses while remaining fundamentally unreliable. It does not understand truth; it predicts likely sequences of language. This is not a minor technical weakness that will simply disappear with the next model update. It is a reminder that teachers and learners still need the capacity to question, verify and interpret. In European VET, where learners are often being prepared for safety-critical, socially consequential or legally regulated occupations, this point should not be underestimated. In such contexts, a polished answer is not enough. What matters is whether a learner can assess a situation, distinguish the credible from the false, and explain why a course of action is justified.
One of the strongest contributions in the issue, especially for a European audience, is the article on Sweden’s reassessment of the digital classroom. After years of pushing screens and one-device-per-student policies, Sweden is now rediscovering something that should never have been forgotten: technology does not improve education on its own. The article describes how digitalisation often prioritised hardware over pedagogy, and how evidence for improved attainment remained weak or even negative for some groups. The lesson here is not that schools and colleges should turn away from digital tools. It is that implementation matters more than novelty. Teacher development matters more than procurement. Curriculum clarity matters more than platforms. And support for students matters more than abstract claims about personalisation.
This is important for VET policy and practice across Europe. For many years, vocational systems have been encouraged to modernise through technology, often under pressure to appear innovative, flexible and labour-market relevant. There is, of course, nothing wrong with innovation. But the Swedish example is a warning against a familiar pattern in education reform: the belief that complex pedagogical and social problems can be solved by introducing devices, dashboards or AI assistants. In colleges, training centres and workplace learning settings, the important questions are more grounded. Does a tool help apprentices understand a task better? Does it strengthen or weaken attention? Does it support inclusion? Does it reduce unnecessary administrative work without narrowing teaching? Does it expand professional judgement, or slowly erode it?
The global perspective of the Courier is equally important. The article on African languages shows how quickly supposedly universal AI systems can fail when they encounter different linguistic and cultural realities. Tools trained largely on dominant languages and Western datasets do not simply produce uneven technical performance. They can marginalise local knowledge, distort meaning and weaken the connection between language, identity and learning. For Europe, this should be read as more than a story about somewhere else. European VET is also multilingual and culturally diverse. Many learners move between home language, workplace language and the language of formal instruction. If AI systems are introduced without attention to this complexity, they may reproduce existing inequalities rather than reduce them.
This wider lens also helps us think more clearly about what is at stake in current discussions of productivity. Much of the public conversation about AI in education is framed around speed: faster lesson preparation, faster feedback, faster content generation, faster access to information. These advantages are real, and many teachers are already making careful and intelligent use of such tools. But the Courier repeatedly returns to a more difficult question: what happens to learning when speed becomes the measure of value? Vocational education has good reason to resist that reduction. To become a skilled worker, technician, craftsperson, care professional or trainer is not simply to accumulate information efficiently. It is to form habits of attention, standards of quality, ethical sensibilities and ways of working with others. Those things take time.
The article on teachers in Argentina captures this tension well. It presents educators as neither naïve enthusiasts nor simple opponents of AI, but as professionals trying to respond to a technology that has already changed assessment and classroom practice. Some move towards oral assessment. Others redesign tasks to make room for reflection and discussion. Many want training, but remain concerned about what is being lost when students hand over too much of their intellectual work to machines. This feels close to the present reality in many European VET contexts. The question is no longer whether AI is available. The question is how to redesign pedagogy so that learners still need to show understanding, performance, judgement and responsibility.
Seen in this light, the new UNESCO Courier issue offers an important correction to the more breathless narratives around AI and education. It suggests that the key question is not whether AI can do some of the things students or teachers currently do. The key question is what educational institutions are for. If education is only about producing outputs, then AI will always seem like a compelling shortcut. But if education is also about becoming capable, critical and socially responsible human beings, then the task looks rather different.
For vocational education and training in Europe, that means we should approach AI neither with panic nor with surrender. We need a critical approach, one that recognises the usefulness of new tools while remaining clear about the purposes of teaching, training and research. AI may help with drafting, translation, simulation, administration and access to information. But it cannot replace the human relationships, situated judgement and shared practices through which vocational knowledge is formed. Nor can it answer the ethical and political questions about whose knowledge counts, which languages are supported, what kinds of work are valued, and what sort of future we are preparing learners to enter.
In that sense, the question posed by UNESCO has a straightforward answer. Yes, we still need to think. Perhaps more than ever. And in vocational education, where learning is inseparable from doing, judging and working with others, we also need spaces where learners can practise thinking together.
References
Wayne Holmes, “Learning to think in the AI era,” The UNESCO Courier, 2 April 2026.
James Maisiri, “African languages, the blind spot of AI,” The UNESCO Courier, 2 April 2026.
Natalia Páez, “Argentine teachers divided over AI,” The UNESCO Courier, 2 April 2026.
