Over the last decade or so the focus of the ed tech conferences I have attended has shifted increasingly away from the technology itself towards what we can do to transform education. In the early years it was as if ed tech enthusiasts were magpies, dazzled by every shiny new tool. Some of that sense of wonder still exists, of course, and is healthy: we need to be alive to new possibilities as technology evolves. But over the years we have learned to become more discriminating, as we found out which tools actually worked in our classrooms, and learned not to try to do too much at one time. The focus started shifting towards pedagogy, towards how to use the tools effectively. Behind this there was always some reflection on the significance of technology's impact on education. Common refrains have been the development of 21st Century Skills, personalised learning, the movement from teacher-centred to student-centred approaches, problem-based learning, speculation about which technologies will disrupt education, and learning based on the burgeoning field of neuroscience. The overall sense has been one of promise: that technology has the potential to make teaching and learning more effective, and that education will become transformative, liberating humanity from a model grounded in the factory system and the mechanised reproduction of knowledge and skills.
This year’s conference was no different in content, although the technologies have changed somewhat. The focus has shifted towards Artificial Intelligence, robotics and coding, especially how to involve women in STEM and how to infuse computational thinking across the curriculum. However, this is the first time my sense was not one of advocacy but of militancy. Speakers from the world of work were united and adamant in their condemnation of schooling itself, stridently asserting a preference for extra-curricular learning and the futility of academic qualifications. Employers, we were told, prefer people who can solve problems. If any learning is required, it can be delivered just-in-time, at the point of need, online via MOOCs. Tertiary qualifications should be modular and stackable, acquired over time as required to solve real-world problems. Educators endorsed this stance, stressing personalised learning, the use of Artificial Intelligence, and even real-time feedback from brain activity. The sense was of an urgent need for a curriculum based on problem solving rather than subject disciplines. If you need some Maths to solve a problem, you can get it online; you don’t need to study Maths divorced from real-world imperatives.
The very idea of tertiary institutions is clearly under massive assault, and it cannot be long before they come for secondary schools as well. What scares me is not the suggestion itself: I agree that learning should be problem-based at some level, and that degree programmes should use MOOCs and blended models to achieve greater modularity and become more student-driven. What scares me is what we lose by doing that. My fears are based on two premises.
Firstly, I believe that knowledge should be pursued for its own sake rather than for the needs of the world of work alone. Of course our education should prepare us for employment or entrepreneurship; to argue that it shouldn’t is folly. But knowledge has its own trajectory and logic. Mathematics, for example, represents a body of knowledge bounded by rules and procedures; it forms a coherent system which cannot be broken up into bite-sized chunks. Can you quickly study calculus without studying basic algebra, just because you need calculus to solve a problem? Historical knowledge is not just a matter of quickly reading up on Ancient Sumeria on Wikipedia; it is founded on a system of evidentiary inquiry within a narrative mode of explanation. I worry that just-in-time knowledge will lack a solid enough base. If we erode the autonomy of the universities and do away with academic research, what happens to knowledge? It becomes shallow and facile.
Secondly, I believe that the discovery model of learning is deeply flawed. Of course, left to our own devices and following our curiosity, we can discover much; it is a fundamental learning principle. But it is not very efficient. There is no earthly reason why teaching should be ditched. Being told something by someone else is as fundamental a learning principle as discovering it for yourself: we learn from each other, an effect of socialised learning. Teaching is an ancient and noble profession, and there seems no reason to abandon it now. The scholar’s dilemma is that you rarely discover anything unless you know it is there, and knowing it is there requires guides and mentors. The world we live in is complex and vast, and we need a working knowledge of a great deal of it. Without extensive teaching, it is difficult to see how we could acquire the knowledge we need.
I would argue that we need a broad-based liberal education, focusing on critical thinking and problem solving, which gives us a grounding in Mathematics, the Sciences, the Arts and Humanities. At that stage, after a first degree, say, the best approach could well be just-in-time content delivered online.
Just because technology can disrupt education doesn’t mean it should. Teachers have been very conservative in their adoption of new technologies, and I think this is a good thing. Education and knowledge are simply too important to change willy-nilly. We need to be certain that we are not destroying our evolutionary advantage, our ability to think, simply because we can.