There’s no such thing as self-directed learning

Graham and I were discussing what personalised learning actually means in light of the promises made by ed tech companies to use AI to offer differentiated, personalised learning. Graham introduced me to the idea of ants following predefined pathways as a metaphor for learners following AI predictions of what to study next. The more ants that follow the same pathway, the more well-worn it becomes and the more likely the AI is to suggest it. The pathway is never truly personalised, just a set of branching options in a choose-your-own-adventure story whose endings are pre-defined.
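To make the metaphor concrete, here is a minimal sketch (not any vendor’s actual algorithm, and with invented pathway names) of that ant-trail feedback loop: pathways that more learners have already followed are recommended more often, which in turn makes them more followed.

```python
import random

# Hypothetical pre-authored pathways; the "endings" are fixed in advance.
# The number is a 'pheromone' weight that grows as learners follow the path.
pathways = {
    "fractions -> decimals -> percentages": 1.0,
    "fractions -> ratios -> percentages": 1.0,
    "fractions -> graphs -> percentages": 1.0,
}

def recommend(paths):
    """Pick a pathway with probability proportional to its weight."""
    names = list(paths)
    weights = [paths[n] for n in names]
    return random.choices(names, weights=weights, k=1)[0]

def simulate(paths, learners=1000, reinforcement=0.1):
    """Each learner follows the recommended pathway, strengthening it slightly."""
    for _ in range(learners):
        chosen = recommend(paths)
        paths[chosen] += reinforcement
    return paths

print(simulate(dict(pathways)))
# Typically one pathway ends up carrying most of the weight: the 'personalised'
# suggestion converges on whichever route the crowd happened to walk first.
```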
I had always assumed that the key to personalising learning was for it to be self-directed. I identify a need (in September I learned how to make pivot tables in Excel), set myself a goal, and find relevant training and support materials (books, YouTube, friends) until I can do the thing by myself. This requires a good level of metacognitive skill: planning, managing, and critically evaluating the learning process. There’s also my intrinsic motivation to learn: to identify my goal, source and synthesise the information, and work out whether or not I have achieved it. I make conscious decisions and I adapt my strategy based on the outcomes. If I have self-direction, I have informed agency.
An online learning platform with adaptive algorithms presents a complex paradox within this framework. On the surface, these systems should empower the learner by offering personalised pathways, recommending content based on performance, and allowing flexibility in pace and timing. This can indeed support elements of self-management. But does this algorithmic guidance foster genuine self-direction, or is it just an illusion of choice within a pre-determined architecture? The learner chooses what and when to click, but the underlying curriculum, the sequence of concepts, and the very definition of ‘progress’ are dictated by the platform design and data points, narrowing epistemic horizons and substituting the system’s predictions for the learner’s own curiosity.
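As a rough illustration (again a sketch with invented unit names, not any particular platform’s implementation), the ‘adaptive’ part often amounts to routing the learner through a fixed, pre-authored graph: the score picks the branch, but every branch and the definition of ‘progress’ were decided by the designer in advance.

```python
# Pre-authored curriculum graph: each node maps a performance band to the
# next node. The learner never leaves the designer's graph.
CURRICULUM = {
    "intro_quiz":     {"low": "remedial_unit", "high": "core_unit"},
    "remedial_unit":  {"low": "remedial_unit", "high": "core_unit"},
    "core_unit":      {"low": "core_unit",     "high": "extension_unit"},
    "extension_unit": {"low": "core_unit",     "high": "completed"},
}

def next_step(current, score, threshold=0.7):
    """Return the pre-defined next node for this performance band."""
    band = "high" if score >= threshold else "low"
    return CURRICULUM[current][band]

# A learner scoring 0.9 on the intro quiz is routed to the core unit;
# a learner scoring 0.4 is routed to remediation. Both routes, and the
# notion of 'completed', exist only inside the designer's graph.
print(next_step("intro_quiz", 0.9))  # core_unit
print(next_step("intro_quiz", 0.4))  # remedial_unit
```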
Can self-directed learning happen in an “AI driven learning environment”? I suppose it depends on the learner. If the platform is used as a tool by a critically engaged learner who sets their own external goals, interrogates the algorithm’s recommendations, and supplements its content with diverse sources, then it can serve as a powerful component of a self-directed strategy. However, if the learner passively accepts the algorithmic pathway as a definitive syllabus, the capacity for critical autonomy deteriorates.
As teachers and trainers, we must cultivate learners who can use technologies mindfully. This involves fostering digital literacies that enable learners to detect, understand, question, and direct the algorithmic influences on their own learning, ensuring that the ‘self’ remains firmly in command of the direction.
About the Image
Distorted Fish School by Lone Thomasky & Bits&Bäume
Lone Thomasky & Bits&Bäume / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/
The image reveals the intentional, coordinated social lives of animals that are often invisible to humans. Because this richness is sometimes hidden beneath the surface, the oceans can be disrupted by noise pollution, mineral and rare-element mining, and contamination caused by AI development. The distortion aims to visually represent the impact of destructive AI developments that are often too easily justified by human ignorance and technological "progress".
