
Category Archives: Learning Theories

Teaching in Masks!

As schools in South Africa begin a phased re-opening, those of us who have been teaching remotely for the last two months will need to get used to teaching some of our students on campus and some remotely. South Africa has decided to send our matriculation and grade 7 students back first on 1 June, with other grades following in a staggered manner. But the situation is complicated by the fact that some students may elect to stay at home rather than come in to school, some may be ill and advised to stay at home, and some may be fragile and attend school intermittently. To be frank, as teachers we do not know what to expect. How many of our students in any class will be on campus? How many at home? But we must be prepared to teach them one way or another.

When schools closed as the lockdown was announced, I think we as EdTech champions had a fairly good idea of how to help teachers prepare for teaching remotely. We were able to train those staff who needed help using the school Learning Management System. We were able to suggest software options for recording lessons, adding whiteboards, setting up online assignments, launching meetings, and so on. I do not feel as confident in any advice we can give for teaching half your class face to face and the other half online, simultaneously! This is completely uncharted territory!

Some teachers have explored Flipped Classroom models in which students watch or read instructional materials at home and then do worked activities in the classroom with the support of the teacher. This flips the traditional model where the teacher introduces concepts at school, and students do exercises which explore and consolidate the concepts at home. It seems to me that the only viable way of teaching simultaneously face-to-face and online needs to take this model as a starting point. If a classroom has an interactive whiteboard, the teacher can use the IWB to display their Learning Management System, be it Google Classroom, Teams or Moodle, so that it can be seen by students in the classroom and by students at home. The teacher can then help both the students in the classroom and those at home complete whatever tasks have been set. If the teacher themselves has to be at home, they can broadcast to the classroom in the same way, with a substitute teacher on site to manage the classroom. Having a web camera installed on the IWB to capture the classroom would help here as well.

This forms a very general infrastructure which could allow for a variety of pedagogical approaches to be explored by teachers. Teachers are used to adapting to changing circumstances, and will find ways of making it work. In larger departments it might be possible for teachers to team teach, one on site and one at home. Likewise I believe it would be beneficial to use students' personal devices in the classroom to pair up students on site with students at home to work through shared activities. For example, one student on site and one at home could discuss a text or work on a shared Google Doc, communicating via the LMS chat or apps such as WhatsApp.

I do not think any of this will be easy, and it will be open to all kinds of technological glitches, but I do believe that we will find ways of working that not only make the best of a bad situation, but also add tools to our armoury as teachers that we can use once things return to normal.

If they ever do.


Does The Great Onlining Offer Opportunities to Move from Teaching Content to Teaching Thinking?

One often hears the view expressed that one of the benefits of the enforced move to teaching online is that it will entail a move away from teaching content, and open up opportunities for a new vision of teaching that foregrounds students' problem-solving skills. One of the many educational trends that have rained down on teachers like the ten plagues of Egypt is the idea that content is outdated, and that what counts in the Twenty First Century is Problem-Solving or Thinking Skills. It is an idea that has become all-pervasive. At every Educational Technology Conference I've ever attended, at some stage a keynote speaker will express this point of view, especially if they come from industry. "What we need is not people with paper qualifications," they say, "it is people who can think and problem-solve!"

But can thinking be distilled from all context and taught as something discrete? Knowledge is changing so fast, the argument goes, that it will become outdated as soon as you teach it, and therefore we should be teaching students to think rather than teaching them content. This idea is seductive because it appeals to a kernel of truth. Knowledge is changing really fast. What I learned about the structure of the atom in high school is certainly not what is taught today! And yet the notion that education's core business has somehow suddenly changed is somewhat ludicrous. Did teachers not teach students how to think pre-millennium? What does thinking that is separated from content look like, anyway?

My own career as a teacher has been affected by this movement towards the explicit teaching of thinking. I teach a class called Thinking Skills. In this class we use problem-based approaches together with the Harvard Visible Thinking Routines and cognitive tools such as the De Bono Thinking Hats, David Hyerle's Thinking Maps and Costa and Kallick's Habits of Mind. These thinking tools and strategies are embedded in every school subject, but the purpose of the Thinking Skills class we run in grades 8-10 is to give importance to thinking itself, and to provide a platform for the explicit teaching of the range of cognitive tools we use across the school. I am in two minds about how effective this is as an approach. Thinking, after all, is always about something. Thinking divorced from content makes no sense, and thinking always has a context. How you think as an historian, a musician or a scientist is different. Learning to think in one context surely confers benefit, and surely fuels habits and dispositions which are transferable to other contexts. But how this happens is not easy to pin down, or easy to demonstrate. Nor is it automatic. We assume that it happens, but we cannot definitively demonstrate that it does. We hope that an awareness of different cognitive tools, and familiarity with using different thinking strategies, will improve our students' thinking skills. We try to teach them to notice when they need to reach into their cognitive tool kits, to reflect on their own thinking, and to become better at choosing appropriate cognitive strategies. But all the documentation in the world does not add up to proof that this is effective. And as much as I think the Thinking class I teach is useful, I do not believe it supplants Maths or English classes in any way. Students still need to learn to think like a mathematician, or think like an artist!

There is some anecdotal evidence, of course, that our approach to cognitive education does work. Visitors to the school express amazement at how well our students engage with problem-solving tasks. As encouraging as this feedback is, it does not amount to proof. The benefit of an explicit Thinking course is not really about improving performance in other subjects; the aim is to improve the ability to think in any context. I think what students enjoy about it is that they get to think about real-world problems without the pressure of assessment or swotting. I think it is also important in that it signals that what the school values is thinking, and the development of thinking dispositions. I believe that this approach has benefits because solving problems helps improve the ability to solve problems. Not least, it builds confidence in the ability to solve problems. As anyone who has ever tackled crossword puzzles will know, once you start to understand how the puzzles are set, and develop strategies for solving the clues, it becomes easier to work through them. And even a difficult seven across will be tackled with a level of confidence that it can be solved, given enough time. The ability to solve crosswords does not necessarily make one a better problem-solver in another context, such as Chess problems. One can be quite good at solving one type of problem, but quite bad at another. In our class we try to tackle different types of problems and help students develop strategies and tools for approaching them. The hope is that each student will develop a sizable toolkit of cognitive tools, and an awareness of which tools work well in which situations.

So, whilst I believe that teaching Thinking has value, I do not believe it can be done divorced from the curriculum. At my school the explicit teaching of thinking is limited; we wish it to be embedded in our curriculum, rather than becoming the focus of the curriculum. It would be lovely to believe that the move online would allow teachers to throw off the yoke of curricula and standardized testing and teach students to think, to problem-solve. Sadly, I do not think that it does. It is rather naive to believe that students, simply by doing an online project rather than more formal classes, will miraculously develop thinking skills. Thinking skills need to be carefully scaffolded and nurtured. Even in a Thinking class, tasks are contextualised and we seek to draw students' attention to opportunities for transferring their skills across the curriculum. As any teacher who has ever set an open-ended project will know, the success of the project depends on how carefully it was scaffolded and supported. Remote learning will not suddenly unlock hidden abilities in our students. If we want those abilities to emerge we need to put in the pedagogical work to develop them. And remote teaching is hard; it is hard enough teaching the regular curriculum.

Doing the kind of work needed to foster advanced thinking skills over Zoom?

I don’t think so.


Why Online Teaching is so Taxing!

Teachers who have been doing remote teaching over the last month or so report complete exhaustion. Not just because they needed to take time to re-design their curricula for remote platforms. Not just because they needed to record videos or re-purpose learning resources for an online platform. Not just because online assessment is a nightmare. But chiefly because of the exhaustion involved in conducting online lessons. And all this at a time when many teachers have to look after their own children and families, when they themselves are experiencing all the stress that we are all going through at this time.

So, why is online teaching itself so taxing? In the classroom you see your class for a set period of time, and you do what you can during the time you have with the class. Your energy goes into being present for your students, either in how you present content, or how you guide and shape their understanding of that content. You have to read the faces in front of you, notice who is beginning to goof off, who wants to ask a question but needs encouragement, who has a puzzled look on their face, or who is clearly engaged in something else and needs re-directing. From the nods of understanding, or the expressions of sudden realization, you know when you can go on, or whether you need to try explaining something in a different way. You can judge whether or how long to wait after asking a question, or whether to rephrase it. Face to face interactions require a great deal of work, and can be exhausting in themselves. Teachers suffer a great deal of cognitive overload. You have to keep in mind not only the content you are teaching, but also all the questions around how best to teach it. It is exhausting! But bells ring, and the day has an end. As exhausting as ordinary teaching may be, the week ends, and eventually the term ends. I have always thought that the length of a term is designed precisely to wring the most work out of students and teachers without completely destroying them in the process.

But online work demands a different level of presence. To be digitally present is to be available long beyond any scheduled lesson, worrying about those who never showed up to any online check-in, or who have missed a submission deadline. Teachers online don't receive absence notes from parents explaining that a student is down with something, or will be away for a few days but will catch up the work. Often all we have online is a silence that begins to prey on the mind. As emails expressing concern over a student missing in action go unanswered and days become weeks, the mind begins to invent all sorts of explanations and fears, and to grieve for lost time and contact. Teachers are concerned about reaching all their students. But during a lockdown, if emails go unanswered, this concern can become all-consuming!

Furthermore, students check in at all hours of the day and night with queries and concerns. I had one student ask a question at 2 am. There is far less of a switch-off point. If you are expecting students to work asynchronously, you more or less have to expect to maintain an asynchronous digital presence yourself. You may have announced that you will be keeping office hours, but if students have been missing in action, when they do pop up at an ungodly hour it is hard not to respond immediately.

When you are teaching synchronously, reading the room is not easy, either. All the usual cues are largely missing. Facial expression and body language are harder to read, and a great deal more effort needs to go into understanding who wants, or needs to speak. Even managing conversations is more difficult with the false starts and technical glitches that bedevil meetings on Zoom, or whatever platform you are using. As someone who finds it difficult enough to read social cues under normal circumstances, learning how to do it all over again online is a nightmare!

All of these things make teaching remotely particularly taxing intellectually and emotionally. The cognitive load is much higher than in face to face teaching. It seems to me that the only way to cope with this added stress is firstly to recognise it, and secondly to begin to re-align our expectations and curriculum planning to accommodate this new reality. Most syllabi stress a relentless loading of content; breadth, not depth, has always been the name of the game. School administrators, districts and examination boards need to reassure teachers that the same coverage of content will not be expected during this period. Educational aims can still be met, but expectations around curriculum content need revision. Is there really a need to study 18 set poems when covering 12 might meet the same aims? Perhaps one major piece of writing can be assessed rather than three? Perhaps some units of study can be left out, and more time spent on the remaining units?

We all need to go easier on ourselves, or teachers will be facing major burnout by the end of the school year!


The Great Onlining – From Digital Natives to Digital Aliens – Reflections after Week Two!

After two weeks of remote teaching, I have to say that mental exhaustion is starting to set in. I can only imagine how challenging it is for students as well. In last week’s blog I highlighted the problem of reaching students online who might not be able to be reached, or might not want to be reached. Technological problems aside, the very constraints of online platforms may make it more difficult for students to focus, find relevant instructions and resources or manage their time effectively enough to be able to complete much work.

Marc Prensky popularised the idea of the Digital Native: one who appears to have a natural, in-born disposition for digital applications. Prensky defined this as a set of dispositions stemming from age alone. Anyone born after a certain date was somehow imbued with technology in their bloodstream, so to speak. The rest of us, born before this date, were digital immigrants who would have to learn how to use technology through pain and sweat. This idea has been thoroughly debunked. Anyone who has ever taught children ICTs will attest to this. Children are not born with the habits, behaviours and dispositions neatly in place to make them natural born users of technology. And many older people take to technology like a duck to water. Nevertheless the concept of digital nativity, of a set of dispositions, a gaze which predisposes a person towards digital use, does seem to hold some merit. We all know people who seem to get it naturally, and others who will probably never cope with anything digital. Perhaps digital nativity is an acquired, cultivated or trained gaze: a way of looking at things which makes some people better at dealing with the new technologies than others. This disposition is not dependent upon age, but describes a spectrum from digital nativity to digital alienation.

When teaching online this becomes absolutely crucial, because the medium of delivery is so dependent upon the technology. In my experience with hybrid classrooms, any class follows a law of thirds, although the size of each third changes from year to year, class to class and lesson to lesson. Students have different digital dispositions. One third I shall call the Digital Natives, with apologies to Marc Prensky. This group is quite capable of working independently online. They can find and follow instructions, manage the resources left by the teacher, and ask questions where needed, completing tasks entirely online. They don't really need a teacher to tell them what to do; they have a capacity and disposition for discovery and an ability to figure things out quite quickly on a digital platform. This group tends to submit assignments on time without prompting, often well before the due date.

A second third, the Digital Immigrants, need instructions to be delivered in the flesh, so to speak. They struggle to locate resources or instructions online, but can cope with whole-class instructions. If a teacher tells them what to do, and where to look, they can then work on their own. This group needs someone to foreground what they need to notice. But once this is done, they are happy to work on the task, although they do ask more questions, and need more scaffolding generally. A quick online check-in meeting may be all they need to get working.

A third group, the Digital Aliens, struggle online, and also need any instructions given to the whole class to be repeated individually. Something said to the group only seems to be processed effectively when repeated once they are ready to process the information. This group may not respond well to instructions given in a group check-in meeting, for example. They need to be taken aside individually and carefully guided through every single step. This is extremely difficult on an online platform. You really need a one-on-one meeting. In class this can be done fairly easily while circulating, but for a student struggling with the technology anyway, setting up an individual tutoring session can be well nigh impossible.

If this perception is correct, it has important implications for remote (and online) instructional design. It suggests that students from each of these groups really need different strategies. In a face-to-face classroom teachers are able to manage these differences much more seamlessly, although it is never easy. Online, differentiating teaching is much more difficult. In the last two weeks I think I have started to get the hang of managing the Digital Natives and Immigrants. By posting instructional videos online ahead of a class, I give the Digital Natives a head start. Then I hold check-in meetings at scheduled times where I can answer questions, share my screen and show students how to do things. I record these as well, as some students seem to need the question and answer to make sense of it all. What is extremely difficult is trying to reach the Digital Aliens, most of whom do not check in during scheduled times, or probably even watch the videos. Often reaching this group involves long, tortuous emails in which I try to make sense of the difficulties they are experiencing and coax them onto the platform.

Sometimes this results in a eureka moment, but often it results in radio silence. I have sent out a number of emails in the last week which basically said something like: send me what you've got so I can have a look. Many of these remain unanswered, but I live in hope that week three will bring my break-through moment with the Digital Aliens!


Is Thinking Digitally a New Habit of Mind or Simply Old Habits in a New Context?


What makes for good thinking? It is commonly assumed that it is our intelligence or abilities that have the greatest effect on our thinking. Ron Ritchhart (2002), however, makes the case that it is our dispositions, our acquired patterns of behaviour, which activate and direct our abilities. More than our abilities themselves, our dispositions afford our capacity for greatness. These dispositions are shaped and formed by our interactions with others, by our culture, society and history (Ritchhart & Perkins, 1997). Dispositions, or habits of mind (Dewey, 2007), are pivotal because they determine when we choose to think, the importance we place on it, and how deeply we question the assumptions and narratives (Bruner, 1991) upon which thought is based, allowing new narratives to emerge.

For example, the habit of mind persisting suggests a disposition towards keeping at it: trying new ways to approach a problem, not being daunted by failure, worrying away at something until you succeed. This disposition applies in many different contexts, including digital spheres, such as computer programming. It is clearly a disposition which affords success. You cannot succeed if you give up too easily!

Likewise the capacity for finding humour affords success in many different contexts. Costa & Kallick (2008) have identified sixteen such habits of successful thinkers. These habits do not describe the knowledge or skills necessary for success, but the dispositions which enable successful thought and action in the world.

In this article I will be arguing that Thinking Digitally is just such a disposition; a habit of mind which predisposes those who demonstrate it towards success in the same way that persistence or finding humour does. It is a disposition towards using digital tools to aid thought and action in the world; towards deploying algorithmic thinking as a way of generating explanations of reality, and using computational thinking as an approach to problem solving. These ideas will be unpacked below, but it is important to note that they share a concern with the disposition towards using digital devices to create, solve problems and understand the world we live in. I have chosen to use the term Thinking Digitally rather than computational thinking, the more commonly used term, because I believe it describes a wider range of uses of digital devices: beyond how we organise tasks to be automated by machines (mathematical modelling, the use of simulations and simulators, calculators and computers), it includes the ways in which we access and evaluate information, and how we author digital texts and communicate our understandings in the current era.

I will argue that Thinking Digitally is a logically separate habit or disposition, rather than the application of any combination of the other sixteen habits in a new context, the digital world. Marc Prensky (2001b) has advanced the idea that people born after a certain date, whom he has dubbed digital natives, carry a natural disposition towards digital technologies. He argues that people born before the advent of the digital era are digital immigrants; that digital technologies are somehow foreign to them, and that while digital natives are able to use digital devices effortlessly, for immigrants there is no such natural facility. In other words, digital dispositions are based on age. While this idea has gained widespread traction, because it seems to explain why some people take to digital technologies like a duck to water while others struggle with them or avoid them entirely, it is a notion that has been roundly critiqued, as we shall see below. I shall argue that the notion of a set of digital dispositions based on certain characteristics or habits of mind, rather than on age, carries weight.

In recent years, within the Computer Science community, teachers have begun to argue that Computational Thinking, often called Algorithmic Thinking, should be integrated into the curriculum, not just as a separate discipline, but as a foundational subject. I will argue that Computational and Algorithmic Thinking form a new mode of thought, different from the paradigmatic or narrative modes, and that the predisposition towards using this mode of thought forms the basis for arguing in favour of a separate habit of mind. I will argue that a disposition towards using computational and algorithmic thought, and, allied to this, towards using digital media to access and legitimate knowledge, forms part of an evolving revolution in thought which is transforming the world, and towards which some people appear to be more disposed, and others less. Acquiring the habit of digital thinking clearly advantages some, and leaves others floundering in an increasingly digital world.

If we look at recent ideas around epistemology, theories about learning and cognition, pedagogy and literacy, we can begin to discern polar opposites which, while they do not define neat categorizations of analogue and digital thought, do point to ways in which certain polarities are afforded by either analogue or digital modalities. We are living in an Information Age (Castells, 2011), in which the storage and processing of information in digital forms is increasingly central to our knowledge and economic survival, and this privileges certain ways of looking at and thinking about the world. Our ability to succeed in this altered landscape is largely dependent upon our dispositions and habits of mind, not just on our knowledge or skills. Some of these dispositions have already been described by Costa and Kallick, but some are unique to digital contexts, and revolve around how we think about our relationship with digital cognitive technologies.

Kahneman (2011) has argued that there are two types of thinking at work. System I thought is fast, intuitive thinking in which we come to rapid assessments based on our accumulated habits of thought and received judgements handed down to us. System II thought, on the other hand, is slow thinking in which we carefully reason through a position, considering the evidence and reaching a judgement based on a logical process. We need to rely on System I thought most of the time. We cannot tell life to wait while we carefully reason over a position. System II thought is, however, vital because it allows us to build better instincts, and base our intuitive responses on more solid ground. In other words, it frames our habits and dispositions. Successful critical thinking is based on building solid habits of mind on which we can depend when we don't have the time to carefully consider our responses.

In much the same way, I want to make the case that the salient feature of our use of digital technologies lies not in the skills and abilities we bring to any given task, but in the dispositions and habits of mind that unlock what we are able to accomplish with technology. Behind digital literacy lies, in other words, a set of mental habits and dispositions which allow the power of the technology to be unleashed. My capacity to use a database is one thing, but it is my digital dispositions which determine when I choose to use a database, for what purposes, and to what effect. It is their dispositions that allow some people to use technology to unlock their creativity and solve problems rapidly and effectively, and that constrain others to use it merely for repetitive or largely meaningless tasks, or not to use it at all.

What are the dispositions and habits of mind of successful people when they are using digital devices and media? Do these dispositions form a logically discrete entity or are digital technologies simply a different context in which the habits and dispositions (Costa & Kallick, 2008) that apply to analogue thinking are applied?

In working through this argument I am going to make a number of moves which are intended to present the case that there is indeed a set of uniquely digital dispositions, and that it makes sense to speak about thinking digitally as an overarching approach that successful thinkers bring to their use of technology, which shapes and guides effective digital practice. Indeed I will argue that these digital dispositions may also be brought to analogue contexts. In other words, we human beings, in learning how to think effectively when using digital tools, may have acquired some dispositions which we can carry back into the real, analogue world.

Computational Thinking

[Diagram: a typical breakdown of the components of computational thinking]

The first move is to argue that one set of digital dispositions hinges on the thinking that we do when we interface with machines. A frequently used term is Computational Thinking (Wing, 2006), which is used to describe the "thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information processing agent" (Lee, 2016). It is essentially taken to mean, then, the thought processes of one who is using a machine to solve a problem, and it is normally constituted as resting on three cognitive pillars: abstraction, automation and analysis. When we use cognitive technologies to automate or perform tasks, we need to program the machine, we need to break the task down into sub-routines that can then be automated, and we need to think the task through, understanding it from the point of view of what it is that the machine can and can't do. The illustration above ("Reading A Book or An Hour of Code?," 2016) is typical of attempts to break down what constitutes computational thought.
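
To make this concrete, here is a minimal sketch in Python (my own toy example, not drawn from any of the sources cited here) of what abstraction, decomposition into sub-routines, and automation might look like for a mundane teaching task:

```python
# Task: remind students who have not yet submitted an assignment.

# Abstraction: model only what matters here -- names and submission status.
submissions = {"Thandi": True, "Sipho": False, "Anna": False, "Raj": True}

# Decomposition: break the task into sub-routines a machine can carry out.
def missing_submissions(records):
    """Return the names with no submission, sorted for readability."""
    return sorted(name for name, submitted in records.items() if not submitted)

def format_reminder(names):
    """Compose a reminder message from a list of names."""
    return "Please submit your work: " + ", ".join(names)

# Automation: the machine now repeats the analysis for us on demand.
print(format_reminder(missing_submissions(submissions)))
# -> Please submit your work: Anna, Sipho
```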

Proponents of Computational Thinking in the curriculum argue that "algorithmic thinking" should form the fourth R of the curriculum: Reading, 'Riting, 'Rithmetic & 'Rithm (Grover & Pea, 2013). The two terms appear to be used interchangeably. What, then, is Algorithmic Thinking?

Algorithmic Thinking vs Narrative & Paradigmatic Thought

[Diagram: Bruner's paradigmatic and narrative modes of thought]

Bruner (1991) sets out a dichotomy between two modalities of thought: the narrative and the paradigmatic. The paradigmatic mode, the scientific mode, seeks to logically categorize the world and test hypotheses of what is. The narrative mode, on the other hand, is concerned with the meaning that is ascribed to lived experience. When explaining the world, why an apple falls to the ground, for example, one could use paradigmatic thinking, appealing to a concept, the theory of gravity. But one could also use the narrative mode to explain why an apple falls to the ground: the apple falls because I knocked it with my elbow, and it rolled off the table and fell to the ground. There is nothing necessarily superior in either explanation. Both have explanatory power.

The advent of complex calculating machines, however, has furnished us with new ways to go about constructing an explanation of the world. Computer modelling offers ways of constructing a model of how tables, elbows and apples might behave under different sets of circumstances. Using a computer, inputting slightly different sets of data, and observing the outcomes can generate a more complete understanding of how apples roll and fall, and allow us to make more accurate predictions. The sheer number of calculations necessary to sustain this mode of thought made it largely impossible before the advent of computers, but it now allows us to predict weather and find distant planets.
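
A hedged sketch of what this looks like in practice (the physics is deliberately crude and every number is invented for illustration): model the nudged apple, vary the input, and observe which nudges send it over the edge, rather than deriving the answer from first principles.

```python
# A deliberately crude model of the knocked apple: vary the input
# (how hard the elbow hits it) and observe the outcome.

TABLE_EDGE = 0.5  # metres from the apple to the edge of the table
FRICTION = 2.0    # deceleration while rolling, m/s^2 (invented constant)

def rolls_off(initial_speed):
    """Does an apple nudged at this speed (m/s) reach the edge?"""
    stopping_distance = initial_speed**2 / (2 * FRICTION)
    return stopping_distance >= TABLE_EDGE

# Sweep the input space and observe the outcomes. Thousands of such
# runs are trivial for a machine, and impractical by hand.
for speed in [0.5, 1.0, 1.5, 2.0]:
    print(f"nudge at {speed} m/s -> falls: {rolls_off(speed)}")
```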

[Diagram: paradigmatic and narrative modes of thought, with an algorithmic mode placed off to one side]

Those who have advanced the idea of an algorithmic mode of thought would be hard pressed to say where to place it in Bruner's schema. Algorithmic thinking is not necessarily about categorizing the world, or logically predicting outcomes, nor is it about narrating an explanation, although it may involve both these things. It shares with paradigmatic thought a concern with establishing procedures, but shares with narrative thought a concern with sequencing. It should probably then be seen as a separate mode of thought. In the diagram above, I have represented this by placing it off to one side.

But, as we noted, there are considerable overlaps, so it would probably be better to represent this schema as a series of overlapping circles. This schema has the advantage of allowing for greater nuance. Bruner himself, for example, shows how narrative thought lies behind much of scientific explanation.

[Diagram: paradigmatic, narrative and algorithmic modes of thought as overlapping circles]

Behind the algorithmic mode of thought lie the mathematical modelling of the world and the methodology of simulation. Mathematical modelling, enabled by the power of computers, represents powerful new ways of acquiring knowledge of the world. Evolutionary Engineering, for example, seeks to use comparisons between random designs to reach more efficient designs in the fewest number of steps. Rather than designing complex solutions from first principles (the paradigmatic view), it compares one design to another, selects the better one, and repeats the process; optimal designs can be reached in relatively short periods of time.
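
A minimal sketch of that compare-select-repeat loop (a generic hill-climbing toy of my own, under invented assumptions, not a real engineering system):

```python
import random

# Toy "design" problem: find the value of x with the highest fitness,
# without deriving the answer analytically (the paradigmatic route).
def fitness(x):
    return -(x - 3.2) ** 2  # the best possible design is x = 3.2

random.seed(0)
design = 0.0  # an arbitrary starting design
for _ in range(1000):
    rival = design + random.uniform(-0.5, 0.5)  # mutate the current design
    if fitness(rival) > fitness(design):        # compare the two designs
        design = rival                          # keep the better one
print(f"evolved design: {design:.3f}")  # converges close to 3.2
```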

Cognitive Offloading & Cognitive Technologies

Because we have such small working memories (Sweller, 1988) we constantly need to offload our cognition onto the environment. We use language and writing to achieve most of this, but we also use our fingers when counting, or devices such as calculators, calendars, books, and increasingly these days digital devices such as cell phones, computers or the Internet. Dror (2007) argues that digital tools have become so pervasive and integral to our thinking processes that they cease to be external aids and become internal components in our cognitive processes. Cognitive technologies, in other words, have become instrumental in shaping our cognitive processes. New forms of distributed cognition (Clark & Chalmers, 2010) have arisen, in which our thinking is distributed across machines and across other people (Dror & Harnad, 2008), radically altering our cognitive landscapes.

These new theories about cognition and knowledge, encompassed within what is known as the Second Cognitive Revolution in psychology (Miller, 2003), clearly represent radical new ways of looking at what it means to be human, and how we see the relationship between man and machine.

Computational Thinking & Digital Thinking

Jeannette Wing, who introduced the term Computational Thinking, described it as "a universally applicable attitude and skill set everyone, not just computer scientists, would be eager to learn and use" (Wing, 2006). Nevertheless, despite a growing receptivity to the idea that Computer Science should be considered a core subject in the curriculum (Grover & Pea, 2013), and despite the growing integration of ICTs in our lives and educational systems, Computational Thinking is still seen largely as part of the Computer Science curriculum rather than as a mode, or disposition, of thought which pervades all subject disciplines. Lee argues that there is a tendency to conflate Computational Thinking with the operational steps involved, with computer literacy, or with information literacy (Lee, 2016). This misses the point that Computational Thinking is in essence the set of dispositions that human beings use when they are using machines to automate thinking processes, or offloading cognition onto machines.

I think part of the difficulty lies in the use of the term computational. As Wing stresses, the aim of Computational Thinking is not to get human beings to think like computers, but to describe how human beings think when they are using machines to solve problems. The term computational is so firmly rooted in the mathematical and engineering sciences, however, that it tends to obscure perhaps the most vital use of computers: communicating and accessing knowledge. How knowledge is legitimated and disseminated lies at the centre of our cognitive identities, and the new digital technologies are in the process of transforming the rules of the game. I believe it is more useful to speak of how human beings organise their thought processes around problem solving using machines as digital thinking rather than computational thinking. This semantic change allows us to conceptualise computers as communication devices as well as computational devices, and to see the dispositions of thought as including both purposes.

Digital technologies have transformed our practices in any number of ways, but it is important to note that every shift in practice represents a mental shift as well. It changes how we approach problem solving in subtle but important ways.

Digital Thinking vs Analogue Thinking

The second move is to claim that human thought is undergoing something of a revolution, and that particular aspects of our cognition are being accentuated and transformed by the rapid advances in digital media and what they afford. These changes are sufficiently momentous, and the contours of such change sufficiently discernible, that we can begin to talk about a difference between digital and analogue thinking. These differences are not absolute; we have not acquired new brains overnight, but they are polarities which are impacting on our cognitive lives. I believe that those aspects of our thought which are afforded by digital technologies have been growing in importance, in line with the development of the information economy and the new world order, which has altered our discursive landscape (Gee & Lankshear, 2006) and may well be altering our brains as well. Neuroplasticity ensures that changes in practice have an impact on our brains (Prensky, 2001b) and, it must be assumed, on our cognition also.

[Diagram: analogue versus digital polarities cutting across the paradigmatic, narrative and algorithmic modes of thought]

That we are experiencing change is not in contention, but how to characterise that change is. There are perhaps two possible views on this. The first, what we might term the hard view, is that Digital Thinking represents a new mode of thought afforded by the new digital media. This modality of thought has sufficient points of difference from Analogue Thinking to warrant identifying a range of habits and dispositions of mind which signal successful practice. This view sees cognition as being revolutionised. Stevan Harnad (1991), for example, talks of the Fourth Cognitive Revolution, as described below, and this has obvious ramifications for our behaviour and habits. The soft view, on the other hand, sees cognition as largely unaltered by digital media, but holds that new approaches and dispositions are needed for dealing with the differences inherent in using digital media. The illustration above attempts to characterise the two polarities of analogue and digital thinking as cutting across the different modalities of thought we discussed previously (Paradigmatic, Narrative and Algorithmic thought).

I do not believe that we can yet adjudicate between these views. But whether cognition itself is changing, or only our habits of mind, does not essentially alter the schema which can be discerned emerging from the literature around digital literacies, cognitive psychology and computational thinking. In either event it makes sense to differentiate the contours of how we perceive the terrain as altered by the introduction of digital technologies.

In an always-on, always-connected world, as more and more devices become connected in the Internet of Things, and as connectivity becomes more commonplace and more integral to our lives, many are asking questions about what this means for our cognitive development. Headlines scream that the Internet is making us dumb (Carr, 2011; Thompson, 2013) and neuroscientists suggest that our brains are being re-wired (Siemens, 2014; Greenfield, 2014) for good or ill. Psychologists suggest that it is affecting our memories (Sparrow, Liu, & Wegner, 2011) or our social lives (Wooley, 2013). There is a sense in which we stand at a crossroads, uncertain of both the direction and meaning of the rapid changes which are shaping our society. Utopian dreams and dystopian fears permeate the debate around the extent to which technology is driving change. Advances in Artificial Intelligence raise the prospect that we will soon not be the only sentient beings on the planet, and usher in a Brave New World where the difference between Human and Machine may not be that clear cut.

I would argue that never before have we needed a roadmap for navigating the future as much as we do now. Moore's Law (Gibbs, 2015), the observation that computing capacity doubles roughly every eighteen months to two years, while strictly about transistor counts, is a metaphor for the rapidity of change, and expresses the sense in which the rate of change is exponential. I would argue that what is required is a new set of habits and dispositions, strategies for dealing with this change, and with the changes associated with the Internet revolution. Our old ways of thinking, the habits and dispositions of centuries, may not be enough to guide us in the future.

The advent of the printing press saw the introduction of a medium of mass communication which has had far-reaching implications for our history, society, and thought. When books were available only to a select few, and knowledge was dependent upon individual transmission, knowledge was hierarchical and legitimated by an appeal to authority alone. In the twelfth century Peter Abelard initiated a revolution in knowledge (Campbell, 1991), an appeal to evidentiary support in challenge to authority, where even theology had to bow before reason; but it was only with the invention of the printing press, and the Industrial Revolution allowing for the mass production of books, that knowledge could be distributed more widely, ushering in the ideological sea change represented by the Enlightenment. How knowledge is legitimated, disseminated and reproduced changes completely how individuals think, learn, create and solve problems.

I would argue that we are at a similar turning point, represented by the new digital media. The ability to self-publish, afforded by the Internet, has changed how knowledge is disseminated, altering legitimation practices. Many academics publish material from their blogs, and the speed of publication ensures peer review, though not in the conventional sense. An example of the future is Wikipedia, that repository of crowdsourced knowledge said to be as accurate as Encyclopedia Britannica. Stevan Harnad's notion of academic "sky-writing" encapsulates the idea of the Internet combining the immediacy of oracy with the reflective power of literacy to bring about the Fourth Cognitive Revolution. Harnad's (1991) schema is tabulated below.

Revolution | Medium | Period | Cognitive affordance | Possible neural change
First Cognitive Revolution | Oracy | Hundreds of thousands of years ago | Communication immediate and interactive | Growth of Wernicke's and Broca's areas?
Second Cognitive Revolution | Literacy | Tens of thousands of years ago | Allows for reflection between communications | Growth of Exner's area?
Third Cognitive Revolution | Printing Press | c. 1450 AD | Revolution of scale |
Fourth Cognitive Revolution | Internet | c. 1990s AD | Near-synchronous communication: brings together immediacy and reflection | Are our brains being rewired?

This periodisation carries with it a sense of the cognitive affordance offered by the new digital media: near-synchronous communication. Ideas can be disseminated rapidly, enabling quick responses and affording collaborative thinking, but with enough time for some reflection before responses are made. The last column, linked to possible neural developments, is highly speculative, and Harnad himself makes only tentative suggestions in this regard, but it feeds into a contemporary literature suggesting that the move from page to screen carries with it some neural rewiring.

News and opinions are shared rapidly across social media, replacing traditional news sources. In a world where breaking news is not mediated by the professional journalistic eye, but blogged, tweeted and facebooked by participants, we need to reassess how we access knowledge about the world. Content is similarly mediated by new mechanisms. In a world where opinions are discussed through the vehicle of the share, the like and the Internet meme, we need to see knowledge as a commodity that is rapidly traded across the globe in relatively unmediated ways.

I think we can begin to discern the basis for an argument that we are seeing a revolution in cognition, a shift away from analogue, towards digital modes of communication and cognition.

Digital media do not replace analogue channels. As a species we still rely on our senses. Nevertheless, digital media, represented by the Internet and other telecommunication technologies, constitute new modes of communication which have far-reaching consequences for how we communicate, think, learn, create and solve problems.

How our thinking is mediated, and the types of cognitive offloading that are afforded by these technologies, may be represented by the following chart, which contrasts Thinking Digitally with Analogue Thinking (the digital pole given first in each pairing):

Hyperspace vs. the five senses. Push technologies in particular afford the delivery of information from sources not limited to our biological five senses. For example, I learn what the weather is outside from my cell phone rather than by looking out of the window.

Fast paced vs. slow paced. Acquiring data in an analogue world is slow and deliberate; gathering data in a digital world is almost immediate and less considered. Speed has advantages and disadvantages and calls for different approaches to evaluating data. I can use a search engine to find out about something rather than going to a public library and finding a book on the subject.

Multi-channel (digital media streams) vs. single channel (biological senses). While data in the analogue world comes at us via the multiple channels of our senses, the digital world opens up even more channels, some of them referencing analogue channels, others entirely in cyberspace. This has advantages and disadvantages and calls for different approaches towards data collection. As I type this, I am receiving data from my cell phone, emails and alerts are appearing in my inbox, and I am able to use the Internet to search for information, all on top of what I sense about the world around me through my biological senses.

Parallel processing vs. serial processing. The speed at which different channels of communication can be accessed affords multi-tasking. Given the severe limitations imposed on our working memory we can never truly multi-task, and we can of course take on simultaneous tasks in the analogue world too, but the plethora of extra channels extends our ability to work on many levels at the same time. I tend to take on more tasks in parallel while using digital media, partly because it is more efficient to do so, and partly because my mind is parcelling out sub-tasks which will take time because they are asynchronous communications: posting a query for help in solving a problem on a forum, for example. While I wait for a reply, I write an email and watch a kitty video on YouTube.

Hypertextual screen reading vs. linear/narrative page reading. Readings in hyperspace are hypertextual rather than linear or narrative. Hypertextual reading affords rapid synthesis and evaluation; linear, narrative reading affords a closely argued train of thought. Using a website I can rapidly skim for relevant information by following links and using the Find or search function. If I want to follow a train of thought I have to read sequentially.

Fast and wide thought vs. slow and deep thought. Multi-tasking and hypertextual reading afford the rapid synthesis and evaluation of ideas and the identification of patterns: Fast Thought. Linear and narrative readings afford closely reasoned consideration of a train of thought. Most decisions we make are the result of Fast Thought, and digital media support it by making more information available and allowing us to find patterns more actively. Page reading, however, affords deep thought, because I have to follow the sequence of words in the order they were written.

The reader creates the text vs. the writer creates the text. Hypertext is epistemologically predisposed towards a constructivist paradigm in which readers construct their own meaning from the world around them, creating the text by following links. Printed text is predicated on an author communicating and transferring a message, and is epistemologically neutral: the text is created by the writer.

Constructivist pedagogical paradigm vs. instructivist pedagogical paradigm. Theories about how we think and learn have changed considerably over the last hundred years or so, with a shift towards theories stressing how meaning is constructed rather than perceived.

Learning by participation vs. learning by acquisition. Sfard (1998) has argued that there are two metaphors for learning, and that both are valid: learning as participation and learning as acquisition.

Symbolic manipulation vs. symbolic creation. Fast thought affords rapid symbol manipulation, while slow thought affords the creation of new symbols. Digital media allow us to rapidly manipulate information in many forms, but page reading allows us to generate new concepts.

Computational (algorithmic) thinking vs. narrative and metaphoric thinking. Jerome Bruner has shown the importance of narrative in our cognition: if we think about an issue we tend to do so by relating it. We also think metaphorically, comparing things to something similar in our experience. Computational thinking is something different: it operates at the level of simulation and modelling. Digital media allow us to create simulations by modelling a problem and playing it out, rather than thinking through a solution by relating its contours, and this often produces unexpected and counterintuitive results.

This schema is presented as polarities of what must inevitably form a continuum. Nor do I believe that the poles form a rigid dichotomy; it is far more accurate to say that there is a dialectical relationship between the two.

The Foundations Of Digital Thinking

This characterization of twin polarities, operating in a dialectical (dialogic) relationship rather than in opposition, allows us to argue that there are several elements in human thought which, if not new, are at least being emphasised and foregrounded by the growing centrality of computers and ICTs in our lives. If we accept that even more important than our abilities and capacities are our dispositions, our ingrained behaviours and responses to the problems and challenges we face, it becomes clear that how we react to this shift in technology from analogue to digital is crucial to what we can accomplish.

What elements then constitute the dispositions of successful users of computational and digital media? If we look at successful use of digital media I think we can start to identify a number of elements.

Embracing Change

We all know that some people seem to embrace change and use it to their advantage, while others appear fazed by change and shy away from it, or appear less imaginative and productive in the ways in which they use it. This has been falsely framed as a generational divide (Prensky, 2001a): that those born after 1985 (digital natives) are somehow wired differently, and have a natural affinity with ICTs which older people (digital immigrants) lack. While our brains may well be wired differently, if I am following this debate correctly, neural plasticity implies that all users of new technologies, regardless of age, will experience changes. There is a great deal of research indicating that this notion of a differently wired, differently thinking generation is a myth (Kirschner & van Merriënboer, 2013). There is no evidence to suggest, for example, that a new generation of youngsters is predisposed, somehow, to self-directed, self-paced, collaborative learning because their brains are somehow wired this way.

The divide is really one of disposition. Some youngsters are "immigrants" as much as some older people are, and some older people are completely at home with technology. Exposure to, and comfort with, a technology does not in itself imply a capacity to use the technology in the most beneficial ways. For example, technology may afford self-directed learning, but it is one's dispositions which determine whether or not these opportunities will be taken up. These dispositions generally need to be taught, and herein lies the central paradox of education. Discovery learning, as attractive as it sounds, is ultimately a deeply flawed idea because of the scholar's dilemma: how can you discover something until you know it is there to be discovered?

For example, our brains, constrained by cognitive load, limit our ability to multi-task (Kirschner & van Merriënboer, 2013; Sweller, 1988). Our dispositions and habits, however, may allow us to handle multiple tasks with greater facility, overcoming the limitations of our biology to some extent.

Successful thinkers tend to embrace change rather than balk at it. Thinking Digitally means that you are constantly ready to adapt to change, and see it as an opportunity rather than a constraint.

Production vs Consumption

They say a picture is worth a thousand words. There is all the difference in the world between creating and editing pictures and viewing them. It is not, however, that the one is good and the other bad. There are skills and sensibilities, dispositions and capacities which are involved in both activities. To read a picture effectively requires a thorough understanding of the genres and contexts in which the picture was taken. Producing a picture also requires skills and abilities, understandings and literacies. I would argue then that we cannot privilege authorship above criticism. Both have value and a place. We are all at some stage or another a producer or a consumer of both analogue and digital technologies and products.

Nevertheless, there are good habits and dispositions around production, and likewise for consumption. These dispositions are likely to be very similar in analogue and digital contexts. Having said this, however, we need to highlight one vital difference.

All technologies require knowledge and skills both around production and consumption. But some technologies are clearly more complex than others and require specialised skills and knowledge. While almost everybody learns to read and write, very few learn to write software code. While most people can use a screwdriver, relatively fewer can use a soldering iron. This complexity factor limits and constrains the use of all technology. But digital technology represents a level of complexity which makes this division between people who can and people who can’t more stark. It is relatively rare to hear someone say of any analogue technology that it is simply beyond them and they cannot use it in any way. And yet this attitude towards digital technology is frequently heard.

A person who is relatively handy can tinker with, and fix a range of analogue technologies without ever being an expert in any of them. And yet, increasingly this tinkering is not possible with digital technologies as the technology requires such specialised skills and knowledge that the ordinary “handyman” cannot fix a problem.

I am no motor mechanic, but even I have been able to get a Volkswagen Beetle motor running again with a little logic, a prayer, and a spanner. There is no way that I could do the same with modern cars, with their electronic complexity and sealed units. The growing complexity of our machines threatens to turn us all into helpless consumers, and robs us of the ability to tweak and tinker which is an essential part of being an effective producer.

Personal computers were first introduced very much within a tinkering culture; the first machines shipped with no applications, and users had to write their own programs. These days the reverse is true: computers and devices come loaded with a wealth of applications, and programming is both unnecessary and discouraged. Production within each application is further consumerized by a move towards authorship using templates and wizards. Computing has moved from a strong producer ethos to one which encourages consumption. The rapid growth in computer usage explains this shift, but I would argue that as the user base expands, so does the need to actively encourage producer dispositions, which are being eroded by the ease of consumption.

Growing social complexity and knowledge specialization mean that we are increasingly reliant on others to produce and fix what we could formerly tinker with ourselves, and this often leaves us helpless. While knowledge and skills specialization accounts for some of this movement from producer to consumer cultures, much of it is driven by underlying habits and dispositions. We can see in the Maker Movement (“Maker culture – Wikipedia, the free encyclopedia,” 2016) and in the thinking behind Computational Thinking a growing reaction to this trend. As analogue technologies become increasingly digitized, I would argue that the need to foster producer mentalities and dispositions should be seen as a core aspect of Digital Thinking.

Successful thinkers display dispositions towards producer mentalities. Thinking Digitally means that you are able to Hack Your Life.

Creativity vs Drudgery

We face a future in which Artificial Intelligence will transform all aspects of our economy and society (Cellan-Jones, 2014). While we do not know what the future holds in store, I would argue that we need to see the growing technologization of society as an opportunity, and to emphasise a disposition towards creativity rather than drudgery. Technology may be used to free up our lives for leisure and creativity, or for drudgery and work. Andrew Feenberg (1991) described the central contradiction between the potentialities for control and for democratization inherent in technology, and speculation over the role of technology has pivoted largely on dystopian and utopian visions of the future linked to these polarities.

Do we face a future in which a few use technology to control and pacify the many, in which humanity at large is debased by a digital divide which passes them over, or will technology free the majority from lives of drudgery to unlock their full potential? This is very much a political choice, and depends upon our social organization. The sharp debates around the politics of Austerity reveal the contours of the choices we have to make within the new world order: whether to accept Neo-Liberal Taylorist arguments about the nature of Capital, or to seek Socialist alternatives and a more equal distribution of wealth. The stark choice is whether we valorize capital above labour, or see technology as an opportunity to unlock human potential in ways not previously imaginable.

We urgently need a disposition towards creativity over drudgery in the ways in which we approach our use of technology.

Jane McGonigal (2011) has written about what she calls the four superpowers of games, and how these powers may make a better world. These superpowers are perhaps the dispositions of successful gamers; McGonigal argues they are the crucial dispositions for life itself. They are:

  1. Urgent Optimism
  2. Blissful Productivity
  3. Social Fabric
  4. Epic Meaning

Her characterization of gaming, as empowering in gamers a hopeful expectation of success in conjunction with productive work within a social network to accomplish meaningful tasks, expresses a disposition towards optimistic, humanistic outcomes which resonates with a view of human purpose vastly at odds with the Malthusian vision of humanity as mere cogs in a machine.

Successful thinkers display dispositions towards creativity and unlocking their human potential. This requires constantly finding new ways to express yourself through technology, rather than allowing it to be used to marginalise your humanity. Thinking Digitally means that you are alive to the human within the machine.

Conclusion

In this exploratory paper I have attempted to argue that the most crucial element of our emerging relationship with technology is the dispositions and habits of mind we bring to our use of machines to automate and extend our thinking. The digital divide is often conceptualised as a divide around access to devices, but in reality it is a mental divide between those whose dispositions empower human creativity and problem solving, and those whose habits of mind limit the realm of the possible.

Bibliography

Bruner, J. (1991). The Narrative Construction of Reality. Critical Inquiry, 18(1), 1–21.

Campbell, J. (1991). The Masks of God: Creative Mythology. Penguin Group USA.

Carr, N. (2011). The Shallows: What the Internet Is Doing to Our Brains. W. W. Norton & Company.

Castells, M. (2011). The Rise of the Network Society: The Information Age: Economy, Society, and Culture. John Wiley & Sons.

Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind – BBC News. Retrieved April 8, 2016, from http://www.bbc.com/news/technology-30290540

Clark, A., & Chalmers, D. J. (2010). The Extended Mind. In The Extended Mind (pp. 26–42).

Costa, A. L., & Kallick, B. (2008). Learning and Leading with Habits of Mind: 16 Essential Characteristics for Success. ASCD.

Dewey, J. (2007). Human Nature and Conduct: An Introduction to Social Psychology. Cosimo, Inc.

Dror, I. E. (2007). Cognitive Technologies and the Pragmatics of Cognition. John Benjamins Publishing.

Dror, I. E., & Harnad, S. R. (2008). Cognition Distributed: How Cognitive Technology Extends Our Minds. John Benjamins Publishing.

Feenberg, A. (1991). Critical theory of technology. Oxford University Press, USA.

Gee, J. P., & Lankshear, C. (2006). The New Work Order: critical language awareness and “fast capitalism” texts. Discourse: Studies in the Cultural Politics of Education. http://doi.org/10.1080/0159630950160102

Gibbs, S. (2015, July 9). Moore’s law wins: new chips have circuits 10,000 times thinner than hairs. Retrieved March 22, 2016, from http://www.theguardian.com/technology/2015/jul/09/moores-law-new-chips-ibm-7nm

Greenfield, S. (2014). Mind Change: How digital technologies are leaving their mark on our brains. Random House.

Grover, S., & Pea, R. (2013). Computational Thinking in K-12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43.

Harnad, S. (1991). Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge. Public-Access Computer Systems Review, 2(1), 39–53.

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin UK.

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do Learners Really Know Best? Urban Legends in Education. Educational Psychologist, 48(3), 169–183.

Lee, I. (2016). Reclaiming the Roots of CT. CSTA Voice, 12(1), 3–4.

Maker culture – Wikipedia, the free encyclopedia. (2016, March). Retrieved April 8, 2016, from https://en.wikipedia.org/wiki/Maker_culture

McGonigal, J. (2011). Reality is Broken: Why Games Make Us Better and How They Can Change the World. Random House.

Miller, G. A. (2003). The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.

Prensky, M. (2001a). Digital Natives, Digital Immigrants: A New Way to Look at Ourselves and Our Kids.

Prensky, M. (2001b). Digital Natives, Digital Immigrants Part 2: Do They Really Think Differently? On the Horizon, 9(6), 1–6.

Reading A Book or An Hour of Code? (2016, March 22). Retrieved May 10, 2016, from https://lh3.googleusercontent.com/-bQWsEynX-8A/VgGCHw2AdrI/AAAAAAABWTU/_y9VVvBQYgM/w1728-h1728/computational-thinking-white-bg.jpg

Ritchhart, R. (2002). Intellectual Character: What It Is, Why It Matters, and How to Get It. John Wiley & Sons.

Ritchhart, R., & Perkins, D. (1997). When Is Good Thinking? In D. Dai & R. Sternberg (Eds.), Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development. Mahwah, NJ: Erlbaum.

Sfard, A. (1998). On Two Metaphors for Learning and the Dangers of Choosing Just One. Educational Researcher, 27(2), 4.

Siemens, G. (2014). Connectivism: A Learning Theory for the Digital Age. Retrieved from http://er.dut.ac.za/handle/123456789/69

Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778.

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Thompson, C. (2013). Smarter Than You Think: How Technology is Changing Our Minds for the Better. HarperCollins UK.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33.

Wooley, S. (2013). Constantly Connected: The Impact of Social Media and the Advancement in Technology on the Study Abroad Experience. Elon Journal of Undergraduate Research in Communications, 4(2). Retrieved from http://www.studentpulse.com/articles/822/constantly-connected-the-impact-of-social-media-and-the-advancement-in-technology-on-the-study-abroad-experience

 

Posted by on December 6, 2019 in Computational Thinking, Critical thinking, Habits Of Mind, Learning Theories, Pedagogy, Thinking Digitally, Thinking Skills

 

EduTech Africa 2019 – Coda

Last week I attended the EduTech Africa 2019 Conference in Johannesburg and would like to wrap up my thoughts on the conference with a few observations. Now that the dust has settled, the thing that stands out most in my mind is the clear recognition of the rise of Computer Science as a K-12 academic discipline. The government’s commitment to rolling out IT as a subject, and the focus on coding across all age groups, have established a clear sense that Computational Thinking and Computer Science belong in the core curriculum in all schools. The recent announcement that the PISA assessments, which offer international benchmarks in Maths and Science, will now include Computational Thinking and Computer Science confirms this. The big question, then, is how we get there. Most of the talks I attended addressed, in some form or other, the issue of how best to teach Computer Science. Passionate teachers shared their best practice, and their failures. So the coda to my reflections on the conference is really to address that question: is there a best method for teaching Computer Science?

NS Prabhu (1990), in answering the question of whether or not there is a best method of teaching, concluded that the key factor in teaching success lies in the teacher’s sense of plausibility: the teacher’s belief that what they are doing makes sense, and their passion for doing it. There is clearly a great deal of plausibility around the teaching of Computer Science at the moment. Obstacles are being treated as opportunities, and there is a very real sense that inventiveness and creativity can overcome the constraints of budget and lack of training.

The clear consensus amongst teachers seems to be that physical computing forms the best approach. Most presentations highlighted the use of coding in conjunction with 3D printing and robotics. My very first exposure to teaching computing was with Seymour Papert’s (1980) Logo system. I did not have the turtles, only the computer interface, but I tried to make it more concrete by using physical cards with shapes students had to emulate. Computer Science is a very abstract subject and needs to be concretised for students as much as possible. The cost of all the kit needed to do this, however, is prohibitive.
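
As an aside for readers who never met Logo: Python’s built-in turtle module is a direct descendant of Papert’s turtle graphics, and a minimal sketch of the kind of shape-emulation task those cards set might look like this (the square is my illustrative choice, not a prescribed exercise):

    import turtle  # Python's standard-library descendant of Logo's turtle

    t = turtle.Turtle()
    for _ in range(4):    # trace the square shown on one of the cards
        t.forward(100)    # move 100 units in the current heading
        t.right(90)       # turn 90 degrees clockwise
    turtle.done()         # keep the drawing window open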

I recently came across the micro:bit, which uses a web-based platform for coding. The code created is downloaded as compiled hex code to the micro:bit chip, which executes it. Crucially, though, the platform also has a web visualisation tool which executes the code alongside the editing window. The micro:bit controllers are themselves fairly cheap, but having a visualisation tool means that more students can code at any one time: a class would need fewer physical chips. I have not yet been able to test the real thing, but it seems to me a perfect fit for the kinds of physical computing tasks I would wish to introduce. It uses a block-coding interface, but you can toggle to JavaScript or Python, making it ideal for transitioning from block-based coding to the text-based fare students will need higher up the school. You can also design 3D-printed parts for interesting projects.
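
By way of illustration, a first physical-computing program might look like the MicroPython sketch below. I have not run this on real hardware, so treat it as a sketch rather than tested code; the microbit module it imports is the board’s standard MicroPython API:

    from microbit import *  # the micro:bit's standard MicroPython module

    while True:
        if button_a.is_pressed():
            display.scroll(str(temperature()))  # show the on-chip temperature
        elif button_b.is_pressed():
            display.show(Image.HAPPY)           # draw on the 5x5 LED matrix
        sleep(100)                              # poll ten times a second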

But I digress; back to best methods. Another strong thread at the conference was computing for problem solving. I have to say that I am a little dubious about the claim that Computational Thinking leads to better problem solving generally. I believe it leads to better problem solving in computational contexts, but transfer of skills from one context to another is always problematic in my view. Nevertheless, I do believe that students should be given real-world problems to solve as far as possible, and Computer Science teachers are leading the way in envisioning how coding could form a central plank in cross-disciplinary problem-solving exercises. There was a great deal of talk at the conference about the need for teachers to “come out of their silos.” There is certainly no need for CS teachers to set projects divorced from the real world, or problems narrowly about computers.

The final method raised at the conference was unplugged computing, an approach which involves modelling algorithmic thinking without a computer. For example, students might be asked to write code to control a classmate acting as a robot performing a certain task. A talk by a primary school teacher on coding in the junior years had us all playing rock, paper, scissors. I’ve forgotten why, but it was great fun!
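
To make the classmate-as-robot idea concrete, here is a minimal sketch with an entirely hypothetical command set; on paper, of course, it is the classmate and not the interpreter loop that executes the program:

    # A hypothetical instruction set for the "classmate as robot" activity.
    # Students write the program on paper; a peer "executes" it literally.
    program = [
        "FORWARD 3",    # take three steps forward
        "TURN LEFT",
        "FORWARD 2",
        "PICK UP PEN",  # the goal of this particular task
    ]

    for instruction in program:
        print(instruction)  # in class the "robot" performs each step instead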

 

In the end, my take-away from the conference was to think about the best approaches for my own classes. And most particularly how to integrate all three of these approaches better. To my mind this is the best sort of take-away!

 

Bibliography

Papert, S. (1980). Mindstorms: Children, Computers, and Powerful Ideas. Basic Books. Retrieved from https://dl.acm.org/citation.cfm?id=1095592

Prabhu, N. S. (1990). There Is No Best Method – Why? TESOL Quarterly, 24(2), 161–176.
 

Developing Tools to Help Students Construct Meaning in Computer Skills

As a teacher of computing applications I have found that the area my students struggle with most is creating and using spreadsheet formulae and database queries. That is to say, they struggle most where they have to apply mathematical formulae, which are by nature abstract, to a concrete task such as applying a 10% discount if certain conditions pertain. The ability to move seamlessly between abstract and concrete is not something all students possess. Piaget described the movement between concrete and formal operational thinking as a maturational process, with children only becoming capable of abstract thought at around 12 years of age, and abstract thinking is thought to continue developing into adulthood as individuals gain more experience with it. This suggests that students need extensive scaffolding to help their abstract thinking skills develop. It is also clear that concepts are difficult to generalise across different contexts.

I have looked at Semantic Wave Theory previously on this blog (e.g. Maton, 2014), a framework drawn from Legitimation Code Theory which shows how movement between the abstract, highly condensed and the concrete, contextualised and simple can be used as a tool to reveal how meaning is unpacked and re-packed within the classroom. Researchers have shown how successful teaching and learning depends on creating repeated movements between the two, described as semantic profiles.

The diagram above illustrates various semantic profiles, which will be instantly recognisable to any teacher. The high semantic flatline occurs when all discourse in the classroom remains at a general, abstract, very theoretical level; the low semantic flatline when discourse stays simple and practical. Clearly what is needed over time is movement between abstract and concrete, complex and simple: a wave-like graph with greater semantic range. Teachers need to help students understand complex, abstract ideas by unpacking the concepts using concrete examples, personal experience and metaphors. Students also need to learn how to repackage their understanding in more abstract academic language in their own words, and teachers need to carefully scaffold this process.

Understanding semantic waves helps us see how best to scaffold spreadsheet formulae and database queries: we need strategies to strengthen and weaken what are called semantic gravity and density, in other words to scaffold movement up and down the semantic wave. Doing this requires an understanding of the relative strengths of semantic gravity and density in various computing applications, which is in itself no easy task. What appears a concrete, practical task to an experienced practitioner often appears abstract and complex to the novice. This is perhaps just another way of saying that as we get used to traversing the gap between abstract and concrete we get better at doing it, and cease to notice it or struggle with it: we operationalise abstract formulae without a second thought, and the task seems simple and concrete to us. We need to see it from the perspective of the novice, who must bring together an understanding of what the computer needs to do expressed in plain language, the mathematical or logical language of the problem, and the syntax of the application or programming language. This process needs very careful scaffolding and support.

I have very recently come across a cognitive tool called the Abstraction Transition Taxonomy (Cutts et al., 2012). The illustration below comes from the paper cited, and demonstrates one way of visualising the processes involved in coding a computer program, or indeed an Excel spreadsheet.

This design process helps bridge the gap between understanding a problem and its solution, and translating that into a working program which then needs to be debugged and checked to see if it does what it is supposed to do. The key stage is the storyboarding in the middle. I like to think of the steps shown above as comprising the following stages:

  • Plain Language: Think about the problem and work through a solution in your mind
  • Maths/Logic: Build any mathematical or logical operators into your solution
  • Application Syntax: Implement your solution using the particular syntax of the app or programming language you are using.

For example:

  • If a class has collected the most money in the school, they get the day off school.
  • If money collected = most money, then day = day off, else day = normal school
  • =IF(cell=max(range);”Day Off”;”normal school”) [in an Excel spreadsheet]
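
To show how only the application-syntax step changes when the tool changes, here is the same rule sketched in Python; the class names and amounts are invented for illustration:

    # Money collected per class; illustrative figures only.
    amounts = {"7A": 950, "7B": 1200, "7C": 1100}

    most = max(amounts.values())            # maths/logic: find the maximum
    for cls, collected in amounts.items():
        if collected == most:               # same condition as the IF formula
            print(cls, "Day Off")
        else:
            print(cls, "normal school")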

It is tempting to see each of these levels (plain language, maths/logic, app syntax) as discrete strengths of semantic gravity, moving from plain language (strong semantic gravity) to maths/logic (weak semantic gravity) and then back to app syntax (strong semantic gravity). This would describe a wave much like the graph shown below. This is a useful way to conceive of the shifts in levels of abstraction while using a computer to solve a problem.

Over the years of teaching spreadsheets, databases and coding, I have developed a routine for modelling how to go about using computers to solve problems which follows the three steps enumerated above. It is summarised as the ELS method:

  • State the problem and solution in plain English
  • Plug in any mathematical or Logical operators
  • Enter it using the particular Syntax of whatever application you are using

This helps students, I think, by giving them a process to follow and helps move up and down the semantic range, but my grade 8s and 9s still struggle to apply it.
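
As a worked illustration of the three ELS steps, here is the 10% discount task mentioned at the outset, modelled in Python; the threshold and prices are invented for the example:

    # English: "If a customer orders more than 100 items, take 10% off."
    # Logic:   total = price x quantity; if quantity > 100, reduce by 10%
    # Syntax (Python in this sketch; in a spreadsheet it would be an IF formula):

    def total_due(unit_price, quantity):
        total = unit_price * quantity
        if quantity > 100:       # the logical operator, plugged in
            total *= 0.90        # apply the 10% discount
        return total

    print(total_due(5.00, 120))  # 540.0 rather than 600.0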

Although the three-step process helps build in a movement up and down the semantic range, it is not enough. Each step represents a semantic range in its own right, for the novice at any rate. When stating a problem’s solution in plain language, one needs to hold in mind both the contextual parameters of the problem and an ideational abstraction of the solution. When working through the mathematical and logical expression of the solution, one needs to jump continually back to the context of the problem and forward to the emerging formula. Translating this formula into the particular syntax of the application likewise requires rapid jumps up and down the spectrum between weak and strong semantic gravity. Although the curve above may well describe the overall movement of meaning in the task, it seems to me to be made up of rapid oscillations between two states, abstract and concrete, a kind of quantum wave, if you like, as the student superimposes an abstract solution on top of a concrete problem. I believe it is this which makes it particularly difficult for novice programmers and spreadsheet or database creators to navigate the coding of a solution. More experienced programmers handle these shifts with ease.

How and Why Questions help move up and down the semantic range

When using the ELS method in a whole-class situation I model the mental process of thinking through the task very closely, drawing on student contributions. But getting students to work in pairs is also very necessary, as it forces them to voice their mental processes, and this helps strengthen and weaken semantic gravity. If you are explaining why you think the formula should be this rather than that, you are effectively making jumps up and down the semantic range: you are dealing with why questions, which tend to raise the level of abstraction, and with how questions, which help concretise your solution. When you try something and it doesn’t work, having to discuss possible reasons with a peer does the same.

Bibliography

Cutts, Q., et al. (2012). The abstraction transition taxonomy: developing desired learning outcomes through the lens of situated cognition. In Proceedings of the Ninth Annual International Conference on International Computing Education Research (pp. 63–70). ACM. https://doi.org/10.1145/2361276.2361290

Maton, K. (2014). A TALL order? Legitimation Code Theory for academic language and learning. Journal of Academic Language and Learning, 8, 34–48.

 

 
 