The recent shenanigans surrounding Cambridge Analytica and Facebook reveal reasons why we should be very wary of Big Data in education. The argument is often advanced that computerization of the classroom will allow for the collection of large amounts of data on a student’s progress, and for increased personalization and more effective pedagogical approaches to be adopted. Teachers’ time is limited, and, especially when asked to teach large classes, they are often unable to give the kind of individual attention we would like. This idea harks back to the teaching machines beloved of Behaviourist psychology and the dream that programmed learning paths could be built into instructional design in such a way as to deliver the right content at the right time for each individual, making learning much more efficient. I have two problems with this notion. Firstly, it ignores the crucial understanding of learning as a social construct, reducing it to a solitary interaction between student and teacher (machine). And secondly, it dovetails so neatly with the great push for Taylorist efficiency and the erosion of privacy as to raise alarm bells around our civil liberties. If they can gather so much data about us when we are young and in school, how on earth will they use it later, when a student has graduated? Will that data be destroyed, or sold on for profit? Will the data belong to the student, the school or the educational publishers producing the software?
At the risk of sounding like a Conspiracy Theorist, I do believe that it is incumbent on us as teachers to do everything in our power to protect the data of our students, especially such sensitive data as intimate knowledge of learning patterns and behaviours! If I know how you learn, I have great insight into how to control your behaviour, what shoes you will buy, or how you will vote!
As important as this point is, I do not want to dwell on it. Learning is not individual; it is social, as Vygotsky pointed out. We learn first socially and then internalize that knowledge individually. The distance between the two Vygotsky termed the Zone of Proximal Development. We need more experienced others to show us not only how to do things or to pass on knowledge, but also to show us what is knowable. What we believe it is desirable to know is also socially constructed. I learn to do things first with the help, guidance and instruction of others, and then, after a while, am able to do them myself. Can machines fulfil the role of the more experienced other? In some ways, yes. Pressey’s testing machines from the 1920s and Skinner’s teaching machines from the 1950s demonstrated that programmed learning could be used with some degree of success. However, these machines, and the computer programs that replaced them, have not been dubbed “drill and kill” for nothing! While there is some research evidence that they were successful for weaker students, their interface and relentless diet of machine-delivered question and answer killed all motivation, and they lost favour as the fortunes of Behaviourism waned.
As Constructivist learning theories gained traction, learning machines were ditched in favour of new theories about how machines could be used in the classroom. Seymour Papert’s influential Constructionism and approaches such as Apple Classrooms of Tomorrow came into vogue. Computers were to be used by students to author content and as tools for active learning. But beyond this, with the advent of the Internet, computers came to be seen as, above all else, tools for communication and collaboration, well suited for affording contact between students. Google Docs, with its capacity to allow multiple users to author a single document simultaneously, unlocked the power of collaboration. Skype could bring other students from across the globe into a classroom, or allow videos to be exchanged across continents. These are hugely engaging uses, and if used properly, can have enormous educational benefits. But they depend on being almost invisible. When you are collaborating in a Google Hangout or a Google Doc you are not concerned about the technology; you are engaging with other people’s minds! Learning is social, meaning we learn by, with and from others.
The notion of the computer as a device that could track student progress and provide just the right input and feedback at just the right time never quite went away, however, and the growing capacity of computers to do this has led to a resurgence in the belief in personalized teaching machines. Many platforms allow student progress to be tracked and content unlocked depending on progress. Khan Academy, for example, has such an interface, and programs such as MyMaths allow teachers to track progress on a dashboard. While this may seem innocuous and indeed beneficial, the “drill and kill” effect is often cited by students, who resist, or try to subvert, such programs when they are used in the classroom. These programs are sold in the name of personalization and with a Big Data tagline. The technology may improve, but at the moment these uses of technology are viewed by students as boring and alienating.
And if the technology improves, the Conspiracy Theorist in me starts to be afraid, really afraid!