
Does The Great Onlining Offer Opportunities to move from Teaching Content to Teaching Thinking?

One often hears the view expressed that one of the benefits of the enforced move to teaching online is that it will entail a move away from teaching content, and open up opportunities for a new vision of teaching that foregrounds students’ problem-solving skills. One of the many educational trends that have rained down on teachers like the ten plagues of Egypt is the idea that content is outdated, and that what counts in the Twenty-First Century is Problem-Solving or Thinking Skills. It is an idea that has become all-pervasive. At every Educational Technology conference I have ever attended, at some stage a keynote speaker expresses this point of view, especially if they come from industry. “What we need is not people with paper qualifications,” they say, “it is people who can think and problem-solve!”

But can thinking be distilled from all context and taught as something discrete? Knowledge is changing so fast, the argument goes, that it will become outdated as soon as you teach it, and therefore we should be teaching students to think rather than teaching them content. This idea is seductive because it appeals to a kernel of truth. Knowledge is indeed changing fast: what I learned about the structure of the atom in high school is certainly not what is taught today! And yet the notion that education’s core business has suddenly changed is somewhat ludicrous. Did teachers not teach students how to think pre-millennium? And what does thinking that is separated from content look like, anyway?

My own career as a teacher has been affected by this movement towards the explicit teaching of thinking. I teach a class called Thinking Skills. In this class we use problem-based approaches together with the Harvard Visible Thinking Routines and cognitive tools such as the De Bono Thinking Hats, David Hyerle’s Thinking Maps and Costa and Kallick’s Habits of Mind. These thinking tools and strategies are embedded in every school subject, but the purpose of the Thinking Skills class we do in grades 8-10 is to give importance to thinking itself, and to provide a platform for the explicit teaching of the range of cognitive tools we use across the school. I am in two minds about how effective this is as an approach. Thinking, after all, is always about something: thinking divorced from content makes no sense, and thinking always has a context. How you think as an historian, a musician or a scientist is different. Learning to think in one context surely confers benefit, and surely fuels habits and dispositions which are transferable to other contexts. But how this happens is not easy to pin down, or easy to demonstrate. Nor is it automatic. We assume that it happens, but we cannot definitively demonstrate that it does. We hope that an awareness of different cognitive tools, and familiarity with using different thinking strategies, will improve our students’ thinking skills. We try to teach them to notice when they need to reach into their cognitive tool kits, to reflect on their own thinking, and to become better at choosing appropriate cognitive strategies. But all the documentation in the world does not add up to proof that this is effective. And as much as I think the Thinking class I teach is useful, I do not believe it supplants Maths or English classes in any way. Students still need to learn to think like a mathematician, or to think like an artist!

There is some anecdotal evidence, of course, that our approach to cognitive education does work. Visitors to the school express amazement at how well our students engage with problem-solving tasks. As encouraging as this feedback is, it does not amount to proof. The benefit of an explicit Thinking course is not really about improving performance in other subjects; the aim is to improve the ability to think in any context. I think what students enjoy about it is that they get to think about real-world problems without the pressure of assessment or swotting. It is also important in that it signals that what the school values is thinking, and the development of thinking dispositions. I believe that this approach has benefits because solving problems helps improve the ability to solve problems, and not least builds confidence in that ability. As anyone who has ever tackled crossword puzzles, for example, will know, once you start to understand how the puzzles are set and develop strategies for solving the clues, it becomes easier to work through them. Even a difficult seven across will be tackled with a level of confidence that it can be solved, given enough time. Yet the ability to solve crosswords does not necessarily make one a better problem-solver in another context, such as chess problems. One can be quite good at solving one type of problem, but quite bad at another. In our class we try to tackle different types of problems and help students develop strategies and tools for approaching them. The hope is that each student will develop a sizable toolkit of cognitive tools, and an awareness of which tools are good in which situations.

So, whilst I believe that teaching Thinking has value, I do not believe it can be done divorced from the curriculum. At my school the explicit teaching of thinking is limited; we wish it to be embedded in our curriculum, rather than becoming the focus of the curriculum. It would be lovely to believe that the move online would allow teachers to throw off the yoke of curricula and standardized testing and teach students to think, to problem-solve. Sadly I do not think that it does. It is rather naive to believe that students, simply by doing an online project rather than more formal classes, will miraculously develop thinking skills. Thinking skills need to be carefully scaffolded and nurtured. Even in a Thinking class, tasks are contextualised, and we seek to draw students’ attention to opportunities for transferring their skills across the curriculum. As any teacher who has ever set an open-ended project will know, the success of the project depends on how carefully it was scaffolded and supported. Remote learning will not suddenly unlock hidden abilities in our students. If we want those abilities to emerge we need to put in the pedagogical work to develop them. And remote teaching is hard; it is hard enough teaching the regular curriculum.

Doing the kind of work needed to foster advanced thinking skills over Zoom?

I don’t think so.

 

The Importance of Teaching Media Creation Skills

There is an abiding myth that kids today are born digital natives. Anyone who has ever taught ICTs in any form will know that this is simply not the case. Digital skills very much have to be taught! Kay and Goldberg (1977) described the computer as a metamedium: a medium, in other words, used in the creation of other media. As such it would seem axiomatic that computing should be taught to everyone. And yet this is far from the case. All over the world computing has to fight for a space in the curriculum. No doubt much of this contention stems from the expense of acquiring computing resources and of securing adequately trained teachers. The great onlining of education has shown us the importance of the computer as a medium of communication, but as a medium of creativity it is scarcely less important. I have taught Photoshop, Flash and Dreamweaver for many years, often in the context of web design or game creation. I find that it is an excellent way to segue into coding for middle school students. Computers can be used to create all manner of digital content, but games are particularly alluring for students.

In this blog post I would like to walk through my thoughts about how the nature of remote teaching will have to change my curriculum and instructional design. I would like to cover the same basic concepts, namely photo-editing and game design, introducing elementary programming concepts along the way.

Starting with image manipulation in Photoshop, one can teach not only photo-editing skills but also copyright issues. I usually teach students to use the Creative Commons search engine to find suitable images that are free of copyright restrictions. There are many platforms available for games creation. Up until last year I used Flash, despite the increasing difficulties as the platform became less and less supported. I had been considering using Scratch instead, but the seamless integration inside websites and the ability to run in a browser still made Flash a viable choice. My school had an Adobe licence, so making good use of that expense was also a consideration. I usually teach students how to create buttons in Flash and use interactive behaviours, which requires starting to use ActionScript. We use existing scripts and learn how to tweak them. After a few tutorials I get the students to design their own games and then help them get them to work. The graphic shows one of the games created by students, which depended upon drag-and-drop behaviours to work.
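The logic underlying those drag-and-drop behaviours is simple enough to sketch in a few lines. This is not ActionScript, and the names are my own invention; it is just a language-neutral sketch, in Python, of the hit test that fires when a dragged item is released over a set of targets:

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rects are (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def on_drop(item, targets):
    """Return the name of the first target the released item overlaps, else None."""
    for name, rect in targets.items():
        if overlaps(item, rect):
            return name
    return None
```

Whatever platform students use, the tweaking they do usually amounts to adjusting exactly this kind of logic: which targets count, and what happens on a hit or a miss.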

So, here’s my problem. I am due to start teaching this unit in May with my grade 8 class, and yet we are likely to be on lockdown, and I am wondering whether it is a unit of work I can teach remotely. Certainly not with Photoshop and Flash, as students are unlikely to have the Adobe Suite at home. Beyond access to the software, there is the question of data and devices: most of my students use iPads if they do not have a laptop. This presents a number of problems. Firstly, I will be really sad to lose the linkage between image editing and games creation. Realizing that everything about remote teaching and learning takes longer, I will probably have to concentrate on the game design alone. For remote teaching, an online photo editor such as Photopea appears to work well; the crucial skill is removing a background and saving as a gif with transparency. But I am not sure that I will be able to adequately support students through both photo-editing and games design online, so I will have to play this aspect by ear.
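What students do by eye in an editor like Photopea, removing a background and leaving transparency behind, can be caricatured in code: knock out anything close to the background colour by setting its alpha to zero. The function name and tolerance below are my own invention, purely to illustrate the idea, with an image reduced to a flat list of RGB tuples:

```python
def knock_out_background(pixels, bg, tol=16):
    """Return RGBA pixels, with pixels near the background colour made transparent.

    pixels: list of (r, g, b) tuples; bg: the (r, g, b) background to remove.
    A pixel within `tol` of bg on every channel gets alpha 0; others get 255.
    """
    out = []
    for r, g, b in pixels:
        close = all(abs(c - t) <= tol for c, t in zip((r, g, b), bg))
        out.append((r, g, b, 0 if close else 255))
    return out
```

Real editors are cleverer about edges and anti-aliasing, of course, but this is the essence of the "magic wand plus delete" workflow I teach.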

In my experience, getting students to the point where they can design their own games requires a good few tutorials teaching basic skills, and then a great deal of scaffolding of the process of discovery, especially where it requires coding beyond my own capacity! Tackling this online presents problems. It is difficult to help students debug their code when you can’t see their screen, or when you have to reconstruct it to test it on your own screen! Whatever platform I use also needs to work on an iPad, for students who do not have access to a laptop or PC, and should not involve downloading software or purchasing an app.

So I have decided to use Scratch on the MIT platform, which runs inside a browser, apparently works fairly well on an iPad, and allows students to use a free account. Students can also share their projects with others. This is crucial because I would like students to work in small groups. I usually get students to do a few tutorials online and then set the project as a group project. Working with groups might prove tricky during remote teaching and learning, but might also help overcome some of the isolation of working from home.

To test the versatility of the platform I created a quick Pong game and a tamagotchi game, and it seems to me that Scratch works very well at enabling game creation. The platform also has tutorials which allow students to work on their own, and to develop capacity beyond any tutorials and tasks I create for the class. It also has an extension for the BBC micro:bit controller, which I use for robotics. I have not been able to explore this yet, but it seems to create some potential tie-ins, which is important. I also use the MIT platform for mobile app design with my grade 9s, so using Scratch on the MIT platform to introduce coding seems a good fit all round.
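For what it is worth, the ball logic at the heart of a Pong game is tiny, which is why it makes such a good first project. Here is a rough Python sketch of what students assemble from Scratch's motion and sensing blocks ("change x by", "if touching edge, bounce"); the coordinate system and function names are my own simplification, not Scratch's:

```python
WIDTH, HEIGHT = 480, 360  # Scratch's stage is 480 x 360

def step(x, y, dx, dy):
    """Advance the ball one frame, reversing direction at the stage edges."""
    x, y = x + dx, y + dy
    if x <= 0 or x >= WIDTH:
        dx = -dx  # bounce off left/right edges
    if y <= 0 or y >= HEIGHT:
        dy = -dy  # bounce off top/bottom edges
    return x, y, dx, dy
```

Once students grasp this loop, adding a paddle is just one more "if touching" test, and the tamagotchi game is the same loop with timers instead of motion.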

To my mind the key to instructional design in a case like this is to have a programme in mind which can be cut short or extended, depending upon the time available and the capacity of the students. In this case the vagaries of remote teaching become a particular concern. I will write a follow-up post after completing the unit.

Bibliography

A. Kay and A. Goldberg, “Personal Dynamic Media,” Computer, vol. 10, no. 3, pp. 31–41, 1977.

 

The Great Onlining – What we have learned about Remote Learning

While questions of giving teachers and students technological access have been important, as Morrow (2007) has pointed out, the majority of students have formal access to school but lack epistemological access: access to adequate knowledge. When schools were closed and the country went into lockdown shortly afterwards, the twin tiers of our unequal education system became terrifyingly visible. State schools and universities by and large went on early holiday, and little remote education was possible given the huge deficit in access to suitable devices, data or even electricity. As the lockdown continues, there seems to be little wriggle room: either online education, or keeping the schools closed and making up time later in the year by cancelling further holidays. More privileged state and private schools, on the other hand, went online with varying degrees of success and were able to keep the school calendar intact. While there were challenges in getting devices and/or data to all teachers and students, access was a sine qua non; yet much of the focus was on technical questions around which platforms to use to teach online, rather than on access alone. I am not convinced that all families had sufficient technological capital to cope with teaching and learning online. Parents and children often had to share devices and data. Internet connectivity was often slow and sites crashed, especially in the beginning, when the large tech companies had to rapidly roll out resources to service the massive increases in demand. Many students had access to personal devices, but quickly discovered that an iPad or smartphone, while fine for completing tasks in the hybrid environment at school, was inadequate for the range of tasks expected of them when learning went online.

I honestly have no solution to the problem of giving adequate technological access to all students and teachers. This requires massive infrastructure investment by the government. While service providers have zero-rated certain educational sites, much of the country lacks the tech resources necessary to support online education, and data costs are prohibitive. The teachers in the #ZAEdu Twitter community have undertaken a number of initiatives to try to get educational platforms zero-rated and to set up advice and assistance for teachers. Government and industry have been lobbied to take steps to enable greater access. This is vital work. But we also need to start to pivot towards thinking not just about how we get teachers and students online, but about how we teach online in such a way that we can do something about reversing the trajectory towards increasing inequality. In other words, we need to move beyond questions of formal access (to schooling, or internet infrastructure) to questions of epistemological access. If we do this right, we might be able to salvage something from the fire. It seems to me that those teachers who were able to teach remotely over this last three-week period, or are starting now, have a huge role to play in reflecting on effective pedagogies, so that questions around the effectiveness of remote teaching and learning do not get ignored.

At the beginning of the year, when teachers drew up their year plans, the assumption would very much have been that teaching would be conducted face-to-face in the classroom. As these plans melt away and teachers re-draw them, it becomes increasingly important to think very deeply about instructional design and pedagogy. If government and industry listen to teachers and start enabling massive online access in South Africa, we as teachers need to be sure that we have figured things out, so that the digital divide does not just shift from being about physical access to being about access to quality teaching online.

Those of you who follow my blog will know that I use Legitimation Code Theory (LCT) as a lens for my research, and for understanding my own pedagogic practice. The LCT Research Group at Wits University, led by Prof. Lee Rusznyak, this week held an online discussion around what LCT has to tell us about remote learning generally, and how we should be re-tailoring our curricula in the light of the Great Onlining. I would like to share some of this thinking, because I believe it is a useful intervention in the conversation right now. Having a theoretical perspective is important because it allows for a common language and tool-set for thinking about the problem at hand. LCT is particularly powerful in this respect because it looks at knowledge and knowing itself, and allows us to use a common approach to thinking about how the nature of knowledge and of knowing impacts on education at every level, and across all fields.

But before getting into theory, I would like to frame this in terms of what I see to be the problem. As an overall comment, it is obvious that remote teaching, classroom teaching and specially designed online courses are three different animals. Each of these modes of pedagogical delivery has a different set of affordances and constraints. Put simply, there are clearly things you can do online that you can’t do in a classroom, and vice versa! These affordances and constraints depend largely on the subject matter being taught and on the context of each situation. One cannot make sweeping claims; the devil really is in the details.

Here’s an example.

One of the hardest things for me as a teacher of computer coding in middle school, with large generalist classes, has been finding a way to help students debug their code. The initial teaching transferred quite nicely online via videos of live-coding on the interface. I used Screencast-O-Matic to record my screen, with an inset of my talking head in one corner. The ability of students to pause and rewind and see my screen up close may even have been more effective than the same thing in the classroom via an interactive whiteboard. But pause and rewind is not the same as being able to ask a question, or to hear questions from peers that you hadn’t thought of. So I also had meeting check-ins on Teams, in which I was available to answer questions and share my screen in response to queries. I recorded these so that students working asynchronously could also view this form of content delivery. But accessing student code to help them debug it was problematic. Most of my students use iPads and find the coding itself difficult on these devices. In class they use desktops. Sharing screens is awkward in Teams, and helping students debug mistakes via email or the chat stream is definitely not the same as being able to see a student’s screen and help them notice where the error lies.

On the other hand, my colleague, who teaches programming to specialist, smaller high school classes has found it relatively easy to tackle these issues. Her students all have laptops rather than iPads, and see her every day rather than once a week! She has found that screen-sharing is much easier with a handful of students and makes debugging code easier! The devil is in the detail! Context is everything. Sweeping generalisations are not really that helpful. Nevertheless, we have to start somewhere.

What MOOCs have to tell us about remote teaching!

Massive online learning initiatives have a relatively long history now, and what they show us quite clearly, I think, is that the standard format of short videos delivering content knowledge, with quizzes checking understanding, works fairly well. Lectures can be enlivened with graphics and visuals far more engaging than the normal chalkboard. Knowledge can certainly be presented comprehensibly, but possibly only if the students already have a good basic knowledge to build upon. The drop-out rate for non-postgraduate students on MOOCs is massive. Modular, stackable instructional design may work well if you have a knowledge base to stack upon, but not for those still seeking to build that base. It seems to me that MOOCs are pretty good at extending knowledge, but not that effective at building knowledge. Those who believe that massive online content delivery to high school students can be built on expert teacher videos streamed across cell phones are not thinking about the pedagogical work that needs to go with this. I have taken quite a few MOOCs over the years, of varying quality. To my mind what appears absolutely crucial is the digital presence of the teacher. The presence of the lecturer and/or teaching assistants is what really makes or breaks a MOOC. The use of Google Hangouts, social media chats or more formal check-in times allows teachers to help students navigate the content effectively. To build knowledge, students need to make connections between ideas, between abstractions and real-world examples, between simple building blocks and larger theoretical approaches. Students need help doing this, and unless the teacher invests a huge amount of pedagogical work in being present digitally, MOOCs tend to fall short. As a general rule, though, there appears to be no reason why online delivery of content knowledge cannot be done effectively, albeit with caveats around the difficulty of linking ideas.

What Classrooms have to tell us about remote teaching!

I think we also need to understand what classrooms do well, and where they fall short. Classroom interactions allow teachers to monitor students much more effectively than online platforms do. A teacher can literally see when students are engaged and when they goof off, can monitor their progress on a task first-hand, and can intervene with much greater flexibility. They can not only respond to questions, but can often sense when a student wants to ask a question, or when something has not been understood. Teachers can gauge when to wait for a student response, when to step in and answer the question, and when to rephrase it. In other words, classrooms are pretty good at affording the reading of social cues, which is much more difficult, if not impossible, online.

However, delivery of content knowledge is often compromised face-to-face by any number of factors. It is difficult for a teacher to compete with the graphic capacity of digital media, or to repeat content endlessly. There comes a point where the lesson ends, and it cannot be rewound or paused. If a teacher makes a mistake in presenting material, the moment can be lost. Videos can be more carefully scripted, rehearsed, edited or re-shot. While teachers can compensate for these misfirings and interruptions, students often leave a class uncertain about what they have heard, with no chance of a replay. There is a case to be made that classrooms are better at the social than at the knowledge bits.

The following sections look at three key concepts within LCT. If you are feeling brave, or are familiar with LCT from previous blogs, then I would suggest reading what follows closely. If you are feeling less up to the task of difficult or new theory, I would suggest skipping the explanations and reading only the bits in italics.

What LCT has to tell us about remote learning!

I believe that the devil is always in the detail, and in the particular context, but I think it would be fair to say that the observations above about the forces at work online and in the classroom largely hold, or at least set out parameters that are useful in thinking about how to approach remote teaching. We need to recognize that the major strengths of online instruction revolve around effective delivery of content knowledge, while social relations are severely constrained; the major strengths of the classroom lie in affording social relations, while content delivery may sometimes be constrained. Perhaps this is why research suggests that while classroom teaching beats online teaching, hybrid delivery equals face-to-face teaching in efficacy. In my normal practice I upload videos of all my lesson content, which students can consult if they are absent or if they need to review: offering the best of both worlds?

LCT has a number of dimensions and each of them informs educational practice in useful ways, and I believe helps us to navigate the maze that is remote learning.

Specialization

A key concept within LCT, an approach developed by Karl Maton (2013) from the work of Pierre Bourdieu on knowers and Basil Bernstein on knowledge, is specialization: the idea that different fields have different codes which represent what makes for legitimate knowledge in that field. Maton argues that all knowledge involves both knowledge (epistemic relations) and knowers (social relations), but that different fields emphasize these differently.

  • In Science, for example, the knowledge is foregrounded. Who you are is relatively unimportant; having the knowledge is what legitimizes you as an expert in the field. It is a Knowledge Code.
  • In the field of English Literature, however, the rules of the game are different. What you know is far less important than having the right gaze: being the right kind of knower, having the right feel, the right eye for it, is what makes you a legitimate knower. It is a Knower Code.
  • Relativist Codes are where neither epistemic nor social relations are foregrounded: in everyday discourse, for example, personal opinion is what counts.
  • Elite Codes are where both the knowledge and the ways of knowing are crucial, as in Music or Architecture.

From what we said about the affordances and constraints of online and classroom spaces above it would seem that different fields will experience code matches and code clashes when moving into an online space. Knowledge Code fields might find online environments offer more matches because knowledge is what is foregrounded. Knower code fields may find more obstacles because social relations are constrained. The devil will always be in the detail, and I am not saying that resourceful teachers will not be able to navigate these difficulties successfully and inventively. But they seem to me to represent underlying forces which inform online practice.

Teachers might be well advised, then, when planning for what might be an extended period of online teaching, to prioritise those bits of their syllabus which sit better with knowledge transmission than with knower building. An English teacher may plan to use Zoom meetings to read a text with their class, but then find half the class has not been able to get online, and the whole experience may become a nightmare. Perhaps it might be better to choose the more knowledge-heavy bits of the syllabus, which can be tackled asynchronously if needs be.

Semantics

Another key concept within LCT is the dimension of semantics. The key idea here is that meaning can be analysed in terms of semantic gravity (the degree to which an idea is concrete and context-bound rather than abstract and generalized) and semantic density (the degree to which an idea is complex rather than simple). Research in LCT seems to suggest that cumulative knowledge building is predicated on repeated movements over time between the abstract and complex and the concrete and simple, called semantic waves. These waves could describe the course of a lesson, a semester course plan, a worksheet or a student essay. Teachers need to make sure that these connections are being made regularly and in both directions.

For example, when a teacher explains a concept they will unpack the idea by explaining it in everyday language, giving concrete examples and using metaphors so that students can understand it. This movement from the abstract and complex towards the concrete and simple is what is called a down escalator. Essentially the teacher is mediating difficult concepts, helping students understand by reformulating them in language and ways that are easier to grasp. But cumulative knowledge building depends upon ideas being connected and understood as part of a larger whole, and upon students being able to take their raw understandings and repackage them in more academic language and understandings: in other words, making up escalators.

Successful curricula describe semantic waves connecting the theoretical with the practical, the abstract with the concrete, the complex with the simple. Common semantic profiles are shown here. Often understanding remains at a theoretical level (a high semantic flatline) or at a simple and practical level (a low semantic flatline). Successive down escalators represent knowledge that is understood but remains segmental, unconnected with the new understandings built by students over time.
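Purely as an illustration, and emphatically not something LCT itself defines, the profiles described above can be caricatured as a toy classifier. Imagine a lesson reduced to a sequence of "abstraction readings", where high numbers stand for the abstract and general and low numbers for the concrete and specific; the labels and thresholds are entirely my own:

```python
def profile(readings, high=0.7, low=0.3):
    """Crudely label a sequence of abstraction readings as a semantic profile."""
    if all(r >= high for r in readings):
        return "high flatline"    # stays abstract and theoretical throughout
    if all(r <= low for r in readings):
        return "low flatline"     # stays concrete and practical throughout
    rises = any(b > a for a, b in zip(readings, readings[1:]))
    falls = any(b < a for a, b in zip(readings, readings[1:]))
    if falls and not rises:
        return "down escalator"   # unpacking only, never repacking
    return "wave"                 # movement in both directions
```

A lesson that only ever unpacks (0.9 down to 0.2, say) comes out as a down escalator; one that unpacks and then repacks comes out as a wave, which is the profile cumulative knowledge building seems to require.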

Any curriculum design would clearly aim at building semantic waves over time, connecting and consolidating knowledge, grounding theory in practice, and grounding abstract ideas in concrete examples to further understanding. But online coursework may not offer affordances for this kind of cumulative knowledge building. While short videos unpacking single ideas are certainly doable, the kind of pedagogical work necessary to sustain extensive cumulative knowledge building is heavily constrained online.

As Rusznyak has pointed out in the discussions around this held online at Wits, it may well be necessary to re-plan curriculum design to maximise the affordances of that portion of the year spent on remote teaching, and do the connecting of the semantic waves later in the year when classes resume in person. Alternatively some subjects may find it best to describe high or low semantic flatlines, and build semantic range later in class.

Since some portions of the syllabus might lend themselves better to different semantic profiles, teachers need to think carefully about how best to sequence and pace their syllabi.

Autonomy

The last dimension to be unpacked by LCT scholars has been autonomy. I don’t want to go into too much detail here, because the discussion so far has been quite dense, but essentially autonomy looks at the extent to which practices come from inside or outside a field (strong or weak positional autonomy) and the purposes to which they are put (strong or weak relational autonomy). For example, in a Science class, the class might be doing Maths (which falls outside the field of Science), but turned to the purpose of doing Science. Or in a Maths class the teacher might be talking about cricket (outside Maths) but using it to understand a parabola (for the purpose of Maths).

Several codes are described:

  • the sovereign code – for example doing Maths for the sake of Maths
  • the exotic code – content clearly outside the syllabus for purposes that have nothing to do with the curriculum
  • the introjected code – in which non-curriculum content is turned to the purpose of doing the curriculum
  • the projected code – in which curriculum content is turned to other purposes, e.g. for the world of work

LCT research has indicated that good educational practice involves tours through different codes: for example, turning exotic material to sovereign purposes by introjecting it, or projecting sovereign content by showing how it is useful for other purposes.

A concern has been raised that as teachers race to put material online for remote instruction, material will be positioned far too much in the sovereign code. This represents an all work and no play approach which has proven the kiss of death to much of the drill-and-kill type of digital content that has been produced for educational consumption. Teachers need to make sure that their online offerings retain the same kinds of introjection and projection that they employ in their normal classrooms.

Conclusion

This blog post has presented a great deal of dense theory, but I hope that the theory has been turned to the purpose of illuminating the kinds of instructional design decisions that teachers will need to make, and the kinds of things they need to be thinking of as we move away from thinking about the technology of teaching and towards thinking about how our pedagogical decisions can give students greater epistemological access.

Bibliography

Maton, K. (2014). Knowledge and Knowers: Towards a realist sociology of education. London, UK: Routledge/Taylor & Francis Group.

Morrow, W. (2007). Learning to teach in South Africa. Cape Town: HSRC Press.

The Great Onlining – From Digital Natives to Digital Aliens – Reflections after Week Two!

After two weeks of remote teaching, I have to say that mental exhaustion is starting to set in. I can only imagine how challenging it is for students as well. In last week’s blog I highlighted the problem of reaching students online who might not be able to be reached, or might not want to be reached. Technological problems aside, the very constraints of online platforms may make it more difficult for students to focus, find relevant instructions and resources, or manage their time effectively enough to complete much work.

Marc Prensky popularised the idea of the Digital Native, one who appears to have a natural, in-born disposition for digital applications. Prensky defined this as a set of dispositions stemming from age alone: anyone born after a certain date was somehow imbued with technology in their bloodstream, so to speak. The rest of us, born before this date, were digital immigrants who would have to learn how to use technology through pain and sweat. This idea has been thoroughly debunked, as anyone who has ever taught children ICTs will attest. Children are not born with the habits, behaviours and dispositions neatly in place to make them natural-born users of technology, and many older people take to technology like a duck to water. Nevertheless the concept of digital nativity, a gaze or set of dispositions which predisposes a person towards digital use, does seem to hold some merit. We all know people who seem to get it naturally, and others who will probably never cope with anything digital. Perhaps digital nativity is an acquired, cultivated or trained gaze: a way of looking at things which makes some people better at dealing with the new technologies than others. This disposition is not dependent upon age, but describes a spectrum from digital nativity to digital alienation.

When teaching online this becomes absolutely crucial because the medium of delivery is so dependent upon the technology. In my experience with hybrid classrooms, any class follows a law of thirds, although the size of each third changes from year to year, class to class and lesson to lesson. Students have different digital dispositions. One third I shall call the Digital Natives, with apologies to Marc Prensky. This group is quite capable of working independently online. They can find and follow instructions, manage the resources left by the teacher, and ask questions where needed to complete tasks entirely online. They don’t really need a teacher to tell them what to do; they have a capacity and disposition for discovery and an ability to figure things out quite quickly on a digital platform. This group tends to submit assignments on time without prompting, often well before the due date.

A second third, the Digital Immigrants, need instructions to be in-the-flesh, so to speak. They struggle to locate resources or instructions online, but can cope with whole-class instructions. If a teacher tells them what to do, and where to look, they can then work on their own. This group needs someone to foreground what they need to notice, but once this is done, they are happy to work on the task, although they do ask more questions, and need more scaffolding generally. A quick online check-in meeting may be all they need to get working.

A third group, the Digital Aliens, struggle online, but also need any instructions given to the whole class to be repeated individually. Something said to the group only seems to be processed effectively when repeated once they are ready for it. This group may not respond well to instructions given in a group check-in meeting, for example. They need to be taken aside individually and carefully guided through every single step. This is extremely difficult on an online platform: you really need a one-on-one meeting. In class this can be done fairly easily while circulating, but for a student struggling with the technology anyway, setting up an individual tutoring session can be well nigh impossible.

If this perception is correct, it has important implications for remote (and online) instructional design. It suggests that students from each of these groups really need different strategies. In a face-to-face classroom teachers are able to manage these differences much more seamlessly, although it is never easy. Online, differentiating teaching is much more difficult. In the last two weeks I think I have started to get the hang of managing the Digital Natives and Immigrants. By posting instructional videos online ahead of a class, the Digital Natives get a head start. Then I hold check-in meetings at scheduled times where I can answer questions, share my screen and show students how to do things. I record these as well, as some students seem to need the question-and-answer to make sense of it all. What is extremely difficult is trying to reach the Digital Aliens, most of whom do not check in during scheduled times, or probably even watch the videos. Often reaching this group involves long, tortuous emails in which I try to make sense of the difficulties they are experiencing and coax them onto the platform.

Sometimes this results in a eureka moment, but often it results in radio silence. I have sent out a number of emails in the last week which basically said something like: send me what you’ve got so I can have a look. Many of these remain unanswered, but I live in hope that week three will bring my break-through moment with the Digital Aliens!

Making Semantic Waves with Robots

Semantic Waves has emerged as an explicit pedagogical approach in Computer Science education. For example, the National Centre for Computing Education in the UK has released a Pedagogy Quick Read on semantic waves. In this blog post I would like to look at how I have been using semantic waves in my robotics classes. Semantic waves track the relative abstraction and complexity of ideas within a lesson. Much of educational practice is geared towards helping students understand relatively complex, abstract ideas in terms they can understand: making them simpler and putting them in context. We also need to help students take their understandings and express them in ways which are more complex and abstract, more academic. When students are trying to code robots to perform particular tasks they need to be able to move between the abstract and complex and the simple and concrete. Making this process explicit can help students understand what they need to do, and helps teachers understand how to scaffold students’ understandings.

Semantics is a dimension within Legitimation Code Theory (Maton, 2014), and looks at the relative level of contextualization (semantic gravity) and condensation (semantic density) of knowledge. Knowledge can be viewed as either highly de-contextualized and abstract (weak semantic gravity, SG-) or as strongly contextualized and concrete (strong semantic gravity, SG+). It can also be seen as highly complex, with meaning heavily condensed (strong semantic density, SD+), or as simple (weak semantic density, SD-). By tracking this movement between abstract and complex (SG-, SD+) and concrete and simple (SG+, SD-) we can see how meaning is changing over time within a classroom. This can help teachers see when they need to help students either strengthen or weaken their semantic gravity or density.

Commonly, meaning either remains at a fairly abstract/complex level, the high semantic flatline (A), or tends to remain at a low semantic flatline (B). What we want to see is a much wider semantic range, with movement up and down the semantic axis (C). The ability to link theory and practice is what is being aimed at.

When choosing which platform to use for teaching robotics, there were a number of considerations. The financial cost of introducing robotics into our grade 8 and 9 year groups was one, but the decision to plump for physical computing was partly driven by the need to strengthen and weaken semantic gravity. Computing is often seen by students as highly abstract and complex. In particular, they struggle with transferring programming solutions from one context to another; in other words, they struggle with applying programming concepts to different contexts, the weakening of semantic gravity. This presupposes the ability to understand those programming concepts in the first place, which requires the strengthening of semantic gravity. The decision to frame coding within a robotics context was taken because we felt it would help strengthen semantic gravity and increase semantic range by allowing students to “see” the results of their code in more tangible ways. We felt it would help if students could test their code in a very practical way, and that they would understand programming principles better if they could see the results of their coding.

We decided to use the BBC micro:bit chip as a platform for robotics because it has a wealth of resources available and an online programming platform offering both text-based and block-based coding, giving us options for strengthening and weakening semantic gravity and density. The online coding platform allows students to switch between block-based and text-based coding simply by toggling between Blocks and JavaScript. It also has a visualizer which displays the results of the program on the chip. The program below rolls a six-sided die. Students can test the program online by clicking Button A on the visualizer. They can then download the program onto the actual chip and test it.
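The die-rolling logic itself is tiny. Stripped of the micro:bit input and display calls, it reduces to a single random pick, which can be sketched in plain JavaScript (rollDie is my name for this sketch; on the micro:bit the equivalent would sit inside an on-button-pressed handler and show the result on the LED matrix):

```javascript
// Sketch of the die-roll logic outside the MakeCode environment.
// On the micro:bit this would run inside an on-button-pressed
// handler and display the result on the LED display.
function rollDie() {
    // Math.random() returns a value in [0, 1); scale and shift to 1..6
    return Math.floor(Math.random() * 6) + 1;
}
```

In the MakeCode editor, toggling this kind of logic between Blocks and JavaScript is precisely what lets students strengthen semantic gravity (blocks, visualizer) or weaken it (text-based code).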

In terms of robotics, students program the chip, which can then be inserted into a robot to drive it.

An AlphaBot2 with a BBC micro:bit chip

In designing the syllabus for robotics in grades 8 & 9 we were also concerned with creating opportunities to strengthen and weaken semantic gravity and density. In the semantic profiles shown above, semantic gravity and density were tracked in unison (between abstract/complex and concrete/simple). But Legitimation Code Theory offers a more nuanced picture of semantics. If we set out the axes of semantic gravity and density on a Cartesian plane, as shown in the diagram, each quadrant represents a semantic code, as follows:

The Rhizomatic Code: Meaning here is abstract (SG-) and complex (SD+). This is the world of abstract, complex theorizing. In many ways this is where students need to operate when coding a more complex program. They need to decide which variables or functions to use in their code, and whether to use “for loops” or nested loops. Decisions are largely abstract and complex.

The Rarefied Code: Meaning is abstract (SG-), but simple (SD-). This quadrant is where the concepts used may be fairly abstract, but remain simple. So, for example, a single variable is used, rather than a variable inside a function call.

The Prosaic Code: Meaning is concrete (SG+) and simple (SD-). In terms of programming, instructions may be straightforward and operational, such as “move forward for 5 seconds”.

The Worldly Code: Here meaning is concrete (SG+) but complex (SD+). In other words, although practical, tasks are complex. A great deal of professional programming takes place at this level.

By the way, the word code here refers to the rules which legitimate practice, rather than to computer programming. In a rhizomatic code, theory is valued and practice is not. In a worldly code, on the other hand, practice is valued above theory.

What becomes apparent from this is that teachers need to lead students on journeys between these codes to help make programming accessible. In other words, semantic waves need to be created so that abstract and complex problems can be broken down into do-able, more concrete or simple tasks, and then reassembled into larger projects. To manage this, a series of tasks was created, designed to introduce several programming concepts such as loops, variables or functions. Tasks were also ordered by complexity. A basic pattern was to have a concrete, simple task (prosaic code), followed by introducing an abstraction (eg. a loop) while keeping the task simple (rarefied code), then introducing more complexity (worldly code), such as setting distance or speed through a variable. Concluding tasks would put together more than one principle, together with complexity (rhizomatic code).

Here are some examples of each code in the unit on robotics. Each of these solutions makes the robot move in a square. All of these solutions are technically correct if we see the problem as making a robot move in a square, but some solutions are more concise (condensed) or applicable across multiple contexts (decontextualized).

Rarefied Code

Rhizomatic Code

Prosaic Code

Worldly Code

Whereas in the prosaic code quadrant the task has been completed using forward and turn moves alone, abstraction has been introduced in the rarefied code by controlling the distance travelled through variables for distance and speed. This allows a change in the values assigned to the variables to alter the size of the square, re-contextualizing the problem. In the worldly code, we have condensed the movements within a repeat loop: semantic density has been increased. In the rhizomatic code, a function has been created using the repeat loop (increased condensation) and variables (increased abstraction). Semantic density has been strengthened, and semantic gravity weakened.
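To make the progression concrete without the screenshots, here is a rough JavaScript sketch of the prosaic and rhizomatic versions of the square task. The robot object is a recording mock of my own invention (the real classes program the micro:bit in MakeCode); only the structure of the two solutions matters:

```javascript
// A mock robot that records its moves; a stand-in for the real
// AlphaBot2 driver, invented purely for this sketch.
function makeRobot() {
    const log = [];
    return {
        forward: (dist, speed) => log.push("F" + dist + "@" + speed),
        turnRight: () => log.push("R90"),
        log: log,
    };
}

// Prosaic code (SG+, SD-): every move written out by hand.
function squareProsaic(robot) {
    robot.forward(20, 50); robot.turnRight();
    robot.forward(20, 50); robot.turnRight();
    robot.forward(20, 50); robot.turnRight();
    robot.forward(20, 50); robot.turnRight();
}

// Rhizomatic code (SG-, SD+): a function condenses the moves into
// a repeat loop and abstracts distance and speed into parameters.
function squareRhizomatic(robot, dist, speed) {
    for (let side = 0; side < 4; side++) {
        robot.forward(dist, speed);
        robot.turnRight();
    }
}
```

Both versions drive the robot through an identical sequence of moves; the difference lies purely in semantic gravity and density. The rhizomatic version can draw a square of any size at any speed simply by changing its arguments.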

Students had been introduced to the use of loops, variables and functions in earlier tasks, but were given the prosaic code shown in the table above as a starter code, and asked to adjust speed and time to make the robot move in a square formation by testing it on the robots, further strengthening semantic gravity. They were then asked to try to use a loop, a variable, and a function to improve on the code. What became evident in the tasks submitted was that some students were able to incorporate variables or loops, and a few were able to incorporate functions, but only a minority could accomplish all three. Some students got stuck at various points in the prosaic, rarefied or worldly code. In a busy and productive classroom, students were encouraged to ask for help, and a “Live Code” session was held demonstrating the use of variables, loops and functions together to make a different shape.

What was plain to me was the need to find pedagogical approaches to help strengthen Semantic Density or weaken Semantic Gravity in a more deliberate fashion. Reading code and tracing through it to see what it does, testing it out, really helps to strengthen Semantic Gravity, but complexifying and abstracting out is far trickier to achieve. Live Code, in which the teacher models solutions and thought processes takes some of the class all the way, but leaves some behind. Mopping up the rest, one-on-one is a bit hit and miss with a large class.

Bibliography

Maton, K. (2014). Knowledge and Knowers: Towards a realist sociology of education. London, UK: Routledge/Taylor & Francis Group.

Is Thinking Digitally a New Habit of Mind or Simply Old Habits in a New Context?

Thinking Digitally Logo

What makes for good thinking? It is commonly assumed that it is our intelligence or abilities that have the greatest effect on our thinking. Ron Ritchhart (2002), however, makes the case that it is our dispositions, our acquired patterns of behaviour, which activate and direct our abilities. More than our abilities themselves, our dispositions afford our capacity for greatness. These dispositions are shaped and formed by our interactions with others, by our culture, society and history (Ritchhart & Perkins, 1997). Dispositions, or habits of mind (Dewey, 2007), are pivotal because they determine when we choose to think, the importance we place on it, and how deeply we question the assumptions and narratives (Bruner, 1991) upon which thought is based, allowing new narratives to emerge.

For example, the habit of mind persisting suggests a disposition towards keeping at it: trying new ways to approach a problem, not being daunted by failure, worrying away at something until you succeed. This disposition applies in many different contexts, including digital spheres, such as computer programming. It is clearly a disposition which affords success. You cannot succeed if you give up too easily!

Likewise, the capacity for finding humour affords success in many different contexts. Costa & Kallick (2008) have identified sixteen such habits of successful thinkers. These habits do not describe the knowledge or skills necessary for success, but the dispositions which enable successful thought and action in the world.

In this article I will be arguing that Thinking Digitally is just such a disposition; a habit of mind which predisposes those who demonstrate it towards success in the same way that persistence or finding humour does. It is a disposition towards using digital tools to aid thought and action in the world; towards deploying algorithmic thinking as a way of generating explanations of reality, and using computational thinking as an approach to problem solving. These ideas will be unpacked below, but it is important to note that they share a concern with the disposition towards using digital devices to create, solve problems and understand the world we live in. I have chosen the term Thinking Digitally rather than the more commonly used computational thinking because I believe it describes a wider range of uses of digital devices. It reaches beyond how we organise tasks that are to be automated by machines, beyond mathematical modelling and the use of simulations, simulators, calculators and computers, to include the ways in which we access and evaluate information, and how we author digital texts and communicate our understandings in the current era.

I will argue that Thinking Digitally is a logically separate habit or disposition rather than the application of any combination of the other sixteen habits in a new context, the digital world. Marc Prensky (2001b) has advanced the idea that people born after a certain date, whom he has dubbed digital natives, carry a natural disposition towards digital technologies. He argues that people born before the advent of the digital era are digital immigrants; that digital technologies are somehow foreign to them, and that while digital natives are able to use digital devices effortlessly, for immigrants there is no such natural facility. In other words, digital dispositions are based on age. While this idea has gained widespread traction because it seems to explain why some people take to digital technologies like a duck to water while others struggle with them, or avoid them entirely, it is a notion that has been roundly critiqued, as we shall see below. I shall argue instead that the notion of a set of digital dispositions, based on certain characteristics or habits of mind rather than on age, carries weight.

In recent years, within the Computer Science community, teachers have begun to argue that Computational Thinking, often called Algorithmic Thinking, should be integrated into the curriculum, not just as a separate discipline, but as a foundational subject. I will argue that Computational and Algorithmic Thinking form a new mode of thought, different from the paradigmatic or narrative modes, and that the predisposition towards using this mode of thought forms the basis for arguing in favour of a separate habit of mind. I will argue that a disposition towards using computational and algorithmic thought, and, allied to this, towards using digital media to access and legitimate knowledge, forms part of an evolving revolution in thought which is transforming the world, and towards which some people appear to be more disposed and others less. Acquiring the habit of digital thinking clearly advantages some, and leaves others floundering in an increasingly digital world.

If we look at recent ideas around epistemology, theories about learning and cognition, pedagogy and literacy, we can begin to discern polar opposites which, while they do not define neat categorizations of analogue and digital thought, do point to ways in which certain polarities are afforded by either analogue or digital modalities. We are living in an Information Age (Castells, 2011), in which the storage and processing of information in digital forms is increasingly central to our knowledge and economic survival, and this privileges certain ways of looking at and thinking about the world. Our ability to succeed in this altered landscape is largely dependent upon our dispositions and habits of mind, not just on our knowledge or skills. Some of these dispositions have already been described by Costa and Kallick, but some are unique to digital contexts, and revolve around how we think about our relationship with digital cognitive technologies.

Kahneman (2011) has argued that there are two types of thinking in operation. System I thought is fast, intuitive thinking in which we come to rapid assessments based on our accumulated habits of thought and received judgements handed down to us. System II thought, on the other hand, is slow thinking in which we carefully reason through a position, considering the evidence and reaching a judgement based on a logical process. We need to rely on System I thought most of the time: we cannot tell life to wait while we carefully reason over a position. System II thought is, however, vital because it allows us to build better instincts, and base our intuitive responses on more solid ground. In other words, it frames our habits and dispositions. Successful critical thinking is based on building solid habits of mind on which we can depend when we don’t have the time to carefully consider our responses.

In much the same way, I want to make the case that the salient feature of our use of digital technologies lies not in the skills and abilities we bring to any given task, but in the dispositions and habits of mind that unlock what we are able to accomplish with technology. Behind digital literacy, in other words, lies a set of mental habits and dispositions which allow the power of the technology to be unleashed. My capacity to use a database is one thing, but it is my digital dispositions which determine when I choose to use a database, for what purposes, and to what effect. It is these dispositions which allow some people to use technology to unlock their creativity and solve problems rapidly and effectively, and which constrain others to use it merely for repetitive or largely meaningless tasks, or not to use it at all.

What are the dispositions and habits of mind of successful people when they are using digital devices and media? Do these dispositions form a logically discrete entity or are digital technologies simply a different context in which the habits and dispositions (Costa & Kallick, 2008) that apply to analogue thinking are applied?

In working through this argument I am going to make a number of moves which are intended to present the case that there are indeed a set of uniquely digital dispositions, and that it makes sense to speak about thinking digitally as an overarching approach that successful thinkers bring to their use of technology, which shapes and guides effective digital practice. Indeed I will argue that these digital dispositions may also be brought to analogue contexts. In other words, we human beings, in learning how to think effectively when using digital tools, may have acquired some dispositions which we can carry back into the real, analogue world.

Computational Thinking

computational-thinking-white-bg

The first move is to argue that one set of digital dispositions hinges around the thinking that we do when we interface with machines. A frequently used term is Computational Thinking (Wing, 2006), which is used to describe the “thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information processing agent” (Lee, 2016). It is essentially taken to mean, then, the thought processes of one who is using a machine to solve a problem, and it is normally constituted as being based on three cognitive pillars: abstraction, automation and analysis. When we use cognitive technologies to automate or perform tasks, we need to program the machine, we need to break the task down into sub-routines that can then be automated, and we need to think the task through, understanding it from the point of view of what it is that the machine can and can’t do. The illustration above (“Reading A Book or An Hour of Code?,” 2016) is typical of attempts to break down what constitutes computational thought.

Proponents of Computational Thinking in the curriculum argue that “algorithmic thinking” should form the fourth R of the curriculum: Reading, ‘Riting, ‘Rithmatic & ‘Rithm (Grover & Pea, 2013). The two terms appear to be used interchangeably. What then is Algorithmic Thinking?

Algorithmic Thinking vs Narrative & Paradigmatic Thought

paradigmatic narrative

Bruner (1991) sets out a dichotomy between two modalities of thought: the narrative and the paradigmatic. The paradigmatic mode, the scientific mode, seeks to logically categorize the world and test hypotheses of what is. The narrative mode, on the other hand, is concerned with the meaning that is ascribed to lived experience. When explaining the world, why an apple falls to the ground, for example, one could use paradigmatic thinking, appealing to a concept, the theory of gravity. But one could also use the narrative mode to explain why an apple falls to the ground: the apple falls because I knocked it with my elbow, and it rolled off the table and fell to the ground. There is nothing necessarily superior in either explanation. Both have explanatory power.

The advent of complex calculating machines, however, has furnished us with new ways to go about constructing an explanation of the world. Computer modelling offers ways of constructing a model of how tables, elbows and apples might behave under different sets of circumstances. Using a computer, and inputting slightly different sets of data and observing the outcomes can generate a more complete understanding of how apples roll and fall and allow us to make more accurate predictions. The sheer number of calculations necessary to sustain this mode of thought was largely impossible before the advent of computers, but it allows us to predict weather, and find distant planets.
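The contrast between the two routes to the same answer can be made concrete. The paradigmatic mode applies a law directly (for a drop from rest, t = √(2h/g)); the algorithmic mode steps a model of the falling apple forward in time and observes the outcome. A minimal sketch in JavaScript (the drop height and step size are arbitrary choices of mine):

```javascript
// Algorithmic mode: simulate a falling apple step by step,
// rather than invoking the closed-form law of gravity directly.
function simulateFall(height, dt = 0.001) {
    const g = 9.81;      // gravitational acceleration, m/s^2
    let y = height;      // current height above the ground, m
    let v = 0;           // current downward speed, m/s
    let t = 0;           // elapsed time, s
    while (y > 0) {
        v += g * dt;     // update velocity over one small time step
        y -= v * dt;     // update position using the new velocity
        t += dt;
    }
    return t;            // time taken to reach the ground, seconds
}
```

For a 1 m drop the simulation converges on the paradigmatic prediction of about 0.45 s. The point is not the answer but the route to it: by varying the inputs and re-running the model, one can explore how the system behaves under circumstances the closed-form law never mentions.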

paradigmatic narrative algorithmic

Those who have advanced the idea of an algorithmic mode of thought would be hard pressed to say where to place it in Bruner’s schema. Algorithmic thinking is not necessarily about categorizing the world or logically predicting outcomes, nor is it about narrating an explanation, although it may involve both of these things. It shares with paradigmatic thought a concern with establishing procedures, but shares with narrative thought a concern with sequencing. It should probably then be seen as a separate mode of thought. In the diagram on the right, I have represented this by placing it off to one side.

But, as we noted, there are considerable overlaps, so it would probably be better to represent this schema as a series of overlapping circles. This has the advantage of allowing for greater nuance. Bruner himself, for example, shows how narrative thought lies behind much of scientific explanation.

paradigmatic narrative algorithmic 2

Behind the algorithmic mode of thought lies mathematical modelling of the world and the methodology of simulation. Mathematical modelling, enabled by the power of computers, represents powerful new ways of acquiring knowledge of the world. Evolutionary Engineering, for example, uses comparisons between random designs to reach more efficient designs in the fewest number of steps. Rather than designing complex solutions from first principles (the paradigmatic view), one compares one design with another, selects the better one, and repeats the process; optimal designs can be reached in relatively short periods of time.
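The evolutionary procedure just described can be sketched in a few lines of JavaScript: generate a random variation of the current design, keep it only if it beats the current best, and repeat. Here the “design” is just a number and fitness is closeness to an arbitrary target value, a toy stand-in for a real engineering objective (all names are mine, not a real library):

```javascript
// Toy sketch of evolutionary search: compare candidate designs
// pairwise and keep the better one, rather than deriving the
// answer from first principles.
function evolve(fitness, mutate, initial, generations = 1000) {
    let best = initial;
    for (let i = 0; i < generations; i++) {
        const candidate = mutate(best);       // random variation on the best design so far
        if (fitness(candidate) > fitness(best)) {
            best = candidate;                 // select the better design
        }
    }
    return best;
}

// Example objective: get as close as possible to the value 42.
const fitness = (x) => -Math.abs(x - 42);     // higher is better
const mutate = (x) => x + (Math.random() - 0.5); // small random change
```

Starting from 0, `evolve(fitness, mutate, 0, 20000)` drifts steadily towards 42 without ever "knowing" where the optimum is; the knowledge emerges from the sheer number of comparisons, which is exactly what computers make cheap.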

Cognitive Offloading & Cognitive Technologies

Because we have such small working memories (Sweller, 1988) we constantly need to offload our cognition onto the environment. We use language and writing to achieve most of this, but we also use our fingers when counting, or devices such as calculators, calendars, books, and increasingly these days digital devices such as cell phones, computers or the Internet. Dror (2007) argues that digital tools have become so pervasive and integral to our thinking processes that they cease to be external aids and become internal components in our cognitive processes. Cognitive technologies, in other words, have become instrumental in shaping our cognitive processes. New forms of distributed cognition (Clark & Chalmers, 2010) have arisen, in which our thinking is distributed across machines and across other people (Dror & Harnad, 2008), radically altering our cognitive landscapes.

These new theories about cognition and knowledge, encompassed within what is known as the Second Cognitive Revolution in psychology (Miller, 2003) clearly represent radical new ways of looking at what it means to be human, and how we see the relationship between man and machine.

Computational Thinking & Digital Thinking

Jeanette Wing, who introduced the term Computational Thinking, described it as “a universally applicable attitude and skill set everyone, not just computer scientists, would be eager to learn and use” (Wing, 2006). Nevertheless, despite a growing receptivity to the idea that Computer Science should be considered a core subject in the curriculum (Grover & Pea, 2013), and despite the growing integration of ICTs in our lives and educational systems, Computational Thinking is still seen largely as part of the Computer Science curriculum rather than as a mode, or disposition, of thought which pervades all subject disciplines. Lee argues that there is a tendency to conflate Computational Thinking with the operational steps involved, with computer literacy, or with information literacy (Lee, 2016). This misses the point that Computational Thinking is in essence the set of thought dispositions that human beings use when they are using machines to automate thinking processes, or offloading cognition onto machines.

I think part of the difficulty lies in the use of the term computational. As Wing stresses, the aim of Computational Thinking is not to get human beings to think like computers, but to describe how human beings think when they are using machines to solve problems. The term computational is so firmly rooted in the mathematical and engineering sciences that it tends to preclude the use of computers for perhaps their most vital task, that of communicating and accessing knowledge. How knowledge is legitimated and disseminated lies at the centre of our cognitive identities, and the new digital technologies are in the process of transforming the rules of the game. I believe that it is more useful to speak of how human beings organise their thought processes around problem solving using machines as digital thinking rather than computational thinking. This semantic change allows us to conceptualise computers as communication as well as computational devices and to see the dispositions of thought as including both purposes.

Digital technologies have transformed our practices in any number of ways, but it is important to note that every shift in practice represents a mental shift as well. It changes how we approach problem solving in subtle but important ways.

Digital Thinking vs Analogue Thinking

The second move is to claim that human thought is undergoing something of a revolution, and that particular aspects of our cognition are being accentuated and transformed by the rapid advances in digital media and the affordances they bring. These changes are sufficiently momentous, and the contours of such change sufficiently discernible, that we can begin to talk about a difference between digital and analogue thinking. These differences are not absolute; we have not acquired new brains overnight, but they are polarities which are shaping our cognitive lives. I believe that those aspects of our thought which are afforded by digital technologies have been growing in importance, in line with the development of the information economy and the new world order, which has altered our discursive landscape (Gee & Lankshear, 2006) and may well be altering our brains as well. Neuroplasticity ensures that changes in practice have an impact on our brains (Prensky, 2001b) and, it must be assumed, on our cognition also.

[Illustration: analogue vs digital thinking]

That we are experiencing change is not in contention, but how to characterise that change is. There are perhaps two possible views on this. The first, what we might term the hard view, is that Digital Thinking represents a new mode of thought afforded by the new digital media. This modality of thought differs from Analogue Thinking in enough significant respects to warrant identifying a range of habits and dispositions of mind which signal successful practice. This view sees cognition as being revolutionised. Stevan Harnad (1991), for example, talks of the Fourth Cognitive Revolution, as described below, and this has obvious ramifications for our behaviour and habits. The soft view, on the other hand, sees cognition as largely unaltered by digital media, but holds that new approaches and dispositions are needed for dealing with the differences inherent in using digital media. The illustration above attempts to characterise two polarities of analogue and digital thinking as cutting across the different modalities of thought we discussed previously (Paradigmatic, Narrative and Algorithmic thought).

I do not believe we can yet decide between these views, but whether cognition itself is changing, or only our habits of mind, does not essentially alter the schema which can be discerned emerging from the literature around digital literacies, cognitive psychology and computational thinking. In either event it makes sense to map the contours of how the terrain is altered by the introduction of digital technologies.

In an always-on, always-connected world, as more and more devices become connected in the Internet of Things, and as connectivity becomes more commonplace and more integral to our lives, many are asking questions about what this means for our cognitive development. Headlines scream that the Internet is making us dumb (Carr, 2011; Thompson, 2013) and neuroscientists suggest that our brains are being re-wired (Siemens, 2014; Greenfield, 2014), for good or ill. Psychologists suggest that it is affecting our memories (Sparrow, Liu, & Wegner, 2011) or our social lives (Wooley, 2013). There is a sense in which we stand at a crossroads, uncertain of both the direction and meaning of the rapid changes which are shaping our society. Utopian dreams and dystopian fears permeate the debate around the extent to which technology is driving change. Advances in Artificial Intelligence raise the prospect that we will soon not be the only sentient beings on the planet, and usher in a Brave New World in which the difference between Human and Machine may not be so clear cut.

I would argue that never before have we needed a roadmap for navigating the future as much as we do now. Moore’s Law (Gibbs, 2015), which observes that the number of transistors on a chip doubles roughly every two years, while strictly a claim about hardware, is a metaphor for the rapidity of change, and expresses the sense in which the rate of change is exponential. I would argue that what is required is a new set of habits and dispositions, strategies for dealing with this change, and with the changes associated with the Internet revolution. Our old ways of thinking, the habits and dispositions of centuries, may not be enough to guide us in the future.
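The arithmetic behind this metaphor is easy to make concrete. As a minimal sketch, assuming the commonly quoted two-year doubling period:

```python
# Doubling arithmetic behind Moore's Law-style growth: with a fixed
# doubling period, growth is exponential in elapsed time.
def doublings(months, period=24):
    return months / period

def growth_factor(months, period=24):
    return 2 ** doublings(months, period)

# Over a decade (120 months) with a two-year doubling period,
# capacity doubles 5 times, a 32-fold increase.
print(doublings(120))      # 5.0
print(growth_factor(120))  # 32.0
```

Exponential growth of this kind is precisely what makes linear intuitions about the pace of change unreliable.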

The advent of the printing press saw the introduction of a medium of mass communication which has had far-reaching implications for our history, society, and thought. When books were only available to a select few, and knowledge dependent upon individual transmission, knowledge was hierarchical and legitimated by an appeal to authority alone. In the twelfth century Peter Abelard initiated a revolution in knowledge (Campbell, 1991): an appeal to evidentiary support in challenge to authority, where even theology had to bow before reason. But it was only with the invention of the printing press, and later the mass production of books enabled by the Industrial Revolution, that knowledge could be distributed widely enough to usher in the ideological sea change represented by the Enlightenment. How knowledge is legitimated, disseminated and reproduced changes completely how individuals think, learn, create and solve problems.

I would argue that we are at a similar turning point represented by the new digital media. The ability to self-publish, afforded by the Internet, has changed how knowledge is disseminated, altering legitimation practices. Many academics now publish material on their blogs, and the speed of publication invites rapid peer scrutiny, though not peer review in the conventional sense. An example of the future is Wikipedia, that repository of crowdsourced knowledge said to be as accurate as the Encyclopaedia Britannica. Stevan Harnad’s notion of academic “sky-writing” encapsulates the idea of the Internet combining the immediacy of oracy with the reflective power of literacy to bring about the Fourth Cognitive Revolution. Harnad’s (1991) schema is tabulated below.

Revolution | Medium | Period | Cognitive affordance | Possible neural correlate
First Cognitive Revolution | Oracy | Hundreds of thousands of years ago | Communication immediate and interactive | Growth of Wernicke’s area and Broca’s area in the brain?
Second Cognitive Revolution | Literacy | Tens of thousands of years ago | Allows for reflection between communications | Growth of Exner’s area in the brain?
Third Cognitive Revolution | Printing Press | c. 1450 AD | Revolution of scale |
Fourth Cognitive Revolution | Internet | c. 1990s AD | Near-synchronous communication: brings together immediacy and reflection | Are our brains being rewired?

This periodisation conveys the cognitive affordance offered by the new digital media: near-synchronous communication. Ideas can be disseminated rapidly, enabling quick responses and affording collaborative thinking, but with enough time for some reflection before responses are made. The last column, linking the revolutions to possible neural developments, is highly speculative, and Harnad himself makes only tentative suggestions in this regard, but it feeds into a contemporary literature suggesting that the move from page to screen carries with it some neural rewiring.

News and opinions are shared rapidly across social media, replacing traditional news sources. In a world where breaking news is not mediated by the professional journalistic eye, but blogged, tweeted and facebooked by participants, we need to reassess how we access knowledge about the world. Content is similarly mediated by new mechanisms. In a world where opinions are discussed through the vehicle of the share, the like and the Internet meme, we need to see knowledge as a commodity that is rapidly traded across the globe in relatively unmediated ways.

I think we can begin to discern the basis for an argument that we are seeing a revolution in cognition, a shift away from analogue, towards digital modes of communication and cognition.

Digital media do not replace analogue channels. As a species we still rely on our senses. Nevertheless, digital media represented by Internet and other telecommunication technologies, represent new modes of communication which have far-reaching consequences for how we communicate, think, learn, create and solve problems.

How our thinking is mediated, and the types of cognitive offloading that are afforded by these technologies may be represented by the following chart which compares Digital and Analogue Contexts:

  1. Hyperspace vs the five senses. Push technologies in particular afford the delivery of information from sources not limited to our biological senses. For example, I learn what the weather is outside from my cell phone rather than by looking out the window.

  2. Fast paced vs slow paced. Acquiring data in an analogue world is slow and deliberate; gathering data in a digital world is almost immediate and less considered. Speed has advantages and disadvantages, and calls for different approaches to evaluating data. I can use a search engine to find out about something rather than going to a public library and finding a book on the subject.

  3. Multi-channel (digital media streams) vs single channel (biological senses). While data in the analogue world comes at us via the multiple channels of our senses, the digital world opens up even more channels, some of them referencing analogue channels, others existing entirely in cyberspace. This calls for different approaches towards data collection. As I type this, I am receiving data from my cell phone, emails and alerts are appearing in my inbox, and I am able to search the Internet for information, all on top of what I sense about the world around me through my biological senses.

  4. Parallel processing vs serial processing. The speed at which different channels of communication can be accessed affords multi-tasking. Given the severe limitations imposed on our working memory we can never truly multi-task, but the plethora of extra channels extends our ability to work on many levels at the same time. I tend to take on more tasks in parallel while using digital media, partly because it is more efficient, and partly because my mind is parcelling out sub-tasks which take time because they rely on asynchronous communication. While I wait for a reply to a query for help posted on a forum, I write an email and watch a kitty video on YouTube.

  5. Hypertextual screen reading vs linear, narrative page reading. Reading in hyperspace is hypertextual rather than linear or narrative. Hypertextual reading affords rapid synthesis and evaluation; linear, narrative reading affords a closely argued train of thought. Using a website I can rapidly skim for relevant information by following links and using the search function; if I want to follow a train of thought I have to read sequentially.

  6. Fast and wide thought vs slow and deep thought. Multi-tasking and hypertextual reading afford the rapid synthesis and evaluation of ideas and the identification of patterns: Fast Thought. Most decisions we make are the result of Fast Thought, and digital media feed it by making more information available and allowing us to find patterns more actively. Page reading, by contrast, affords deep thought, because I have to follow the sequence of words in the order they were written.

  7. The reader creates the text vs the writer creates the text. Hypertext is epistemologically predisposed towards a constructivist paradigm in which readers construct their own meaning, creating the text as they follow links. Print is predicated on an author communicating and transferring a message, and is in that sense epistemologically neutral: the text is created by the writer.

  8. Constructivist vs instructivist pedagogical paradigm. Theories about how we think and learn have changed considerably over the last hundred years or so, with a shift towards theories stressing how meaning is constructed rather than merely perceived.

  9. Learning by participation vs learning by acquisition. Sfard (1998) has argued that there are two metaphors for learning, and that both are valid: learning as participation and learning as acquisition.

  10. Symbolic manipulation vs symbolic creation. Fast thought affords rapid symbol manipulation, while slow thought affords the creation of new symbols. Digital media allow us to rapidly manipulate information in many forms, but page reading allows us to generate new concepts.

  11. Computational (digital, algorithmic) thinking vs narrative and metaphoric thinking. Jerome Bruner has shown the importance of narrative in our cognition: if we think about an issue we tend to do so by relating it, and we also think metaphorically, comparing things to something similar in our experience. Computational thinking is different; it operates at the level of simulation and modelling. Digital media allow us to create simulations by modelling a problem and playing it out, rather than thinking through a solution by relating its contours, and this can produce unexpected and counterintuitive results.
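A classic instance of simulation overturning intuition is the birthday problem. The sketch below (my own illustrative example, not from the text) models rooms of people directly instead of reasoning the answer out:

```python
import random

# Birthday problem by simulation: estimate the probability that, in a
# room of n_people, at least two share a birthday, by generating many
# random rooms rather than deriving the closed-form answer.
def shared_birthday(n_people, trials=20000, seed=42):
    rng = random.Random(seed)   # fixed seed so the estimate is repeatable
    hits = 0
    for _ in range(trials):
        birthdays = [rng.randrange(365) for _ in range(n_people)]
        if len(set(birthdays)) < n_people:  # a duplicate collapsed the set
            hits += 1
    return hits / trials

# Intuition says 23 people is far too few; the simulation says otherwise.
print(shared_birthday(23))  # roughly 0.51, i.e. better than even odds
```

The counterintuitive result emerges from playing the model out, which is exactly the mode of thought that simulation and modelling afford.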

This schema is presented as polarities of what must inevitably form a continuum, and I do not believe that they form a rigid dichotomy. It is far more accurate to say that there is a dialectical relationship between the two.

The Foundations Of Digital Thinking

This characterization of twin polarities, operating in a dialectical (dialogic) relationship rather than in opposition, allows us to argue that there are several elements in human thought which, if not new, are at least being emphasised and foregrounded by the growing centrality of computers and ICTs in our lives. If we accept that even more important than our abilities and capacities are our dispositions, our ingrained behaviours and responses to the problems and challenges we face, it becomes clear that how we react to this shift in technology from analogue to digital is crucial to what we can accomplish.

What elements then constitute the dispositions of successful users of computational and digital media? If we look at successful use of digital media I think we can start to identify a number of elements.

Embracing Change

We all know that some people seem to embrace change and use it to their advantage, while others appear fazed by change and shy away from it, or use it in less imaginative and productive ways. This has been falsely framed as a generational divide (Prensky, 2001a): the claim that those born after 1985 (digital natives) are somehow wired differently, and have a natural affinity with ICTs which older people (digital immigrants) lack. While our brains may well be wired differently, neural plasticity implies, if I am following this debate correctly, that all users of new technologies, regardless of age, will experience such changes. There is a great deal of research indicating that the notion of a differently wired, differently thinking generation is a myth (Kirschner & van Merriënboer, 2013). There is no evidence to suggest, for example, that a new generation of youngsters is predisposed, somehow, to self-directed, self-paced, collaborative learning because their brains are wired that way.

The divide is really one of disposition. Some youngsters are “immigrants” as much as some older people, and some older people are completely at home with technology. Exposure to, and comfort with, a technology does not in itself imply a capacity to use that technology in the most beneficial ways. For example, technology may afford self-directed learning, but it is one’s dispositions which determine whether or not these opportunities are taken up. These dispositions generally need to be taught, and herein lies the central paradox of education. Discovery learning, as attractive as it sounds, is ultimately a deeply flawed idea because of the scholar’s dilemma: how can you discover something until you know it is there to be discovered?

For example, our brains, constrained by cognitive load, are limited in their ability to multi-task (Kirschner & van Merriënboer, 2013; Sweller, 1988). Our dispositions and habits, however, may allow us to handle multiple tasks with greater facility, overcoming the limitations of our biology to some extent.

Successful thinkers tend to embrace change rather than balk at it. Thinking Digitally means that you are constantly ready to adapt to change, and see it as an opportunity rather than a constraint.

Production vs Consumption

They say a picture is worth a thousand words. There is all the difference in the world between creating and editing pictures and viewing them. It is not, however, that the one is good and the other bad. There are skills and sensibilities, dispositions and capacities which are involved in both activities. To read a picture effectively requires a thorough understanding of the genres and contexts in which the picture was taken. Producing a picture also requires skills and abilities, understandings and literacies. I would argue then that we cannot privilege authorship above criticism. Both have value and a place. We are all at some stage or another a producer or a consumer of both analogue and digital technologies and products.

Nevertheless, there are good habits and dispositions around production, and likewise for consumption. These dispositions are likely to be very similar in analogue and digital contexts. Having said this, however, we need to highlight one vital difference.

All technologies require knowledge and skills both around production and consumption. But some technologies are clearly more complex than others and require specialised skills and knowledge. While almost everybody learns to read and write, very few learn to write software code. While most people can use a screwdriver, relatively fewer can use a soldering iron. This complexity factor limits and constrains the use of all technology. But digital technology represents a level of complexity which makes this division between people who can and people who can’t more stark. It is relatively rare to hear someone say of any analogue technology that it is simply beyond them and they cannot use it in any way. And yet this attitude towards digital technology is frequently heard.

A person who is relatively handy can tinker with, and fix a range of analogue technologies without ever being an expert in any of them. And yet, increasingly this tinkering is not possible with digital technologies as the technology requires such specialised skills and knowledge that the ordinary “handyman” cannot fix a problem.

I am no motor mechanic, but even I have been able to get a Volkswagen Beetle motor running again with a little logic, a prayer, and a spanner. There is no way that I could do the same with modern cars, with their electronic complexity and sealed units. The growing complexity of our machines threatens to turn us all into helpless consumers, and robs us of the ability to tweak and tinker which is an essential part of being an effective producer.

Personal computers were first introduced very much within a tinkering culture; the first computers were shipped with no applications – users had to create their own programs. These days the reverse is true – computers and devices are loaded with a wealth of applications, and programming is both unnecessary and discouraged. Production within each application is further consumerized by a move towards authorship using templates and wizards. Computing has moved from a strong producer ethos to one which encourages consumption. The rapid growth in computer usage explains this shift, but I would argue that as the user base expands so does the need to actively encourage producer dispositions, which are being lost by the ease of consumption.

Growing social complexity and knowledge specialization means that increasingly we are reliant on others to produce and fix, what formerly we could tinker with ourselves. This leaves us often helpless. While knowledge and skills specialization accounts for some of this movement from producer to consumer cultures, much of it is driven by underlying habits and dispositions. We can see in the Maker Movement (“Maker culture – Wikipedia, the free encyclopedia,” 2016) and in the thinking behind Computational Thinking, a growing reaction to this trend. As analogue technologies become increasingly digitized, I would argue that the need to foster producer mentalities and dispositions should be seen as a core aspect of Digital Thinking.

Successful thinkers display dispositions towards producer mentalities. Thinking Digitally means that you are able to Hack Your Life.

Creativity vs Drudgery

We face a future in which Artificial Intelligence will transform all aspects of our economy and society (Cellan-Jones, 2014). While we do not know what the future holds in store, I would argue that we need to see the growing technologization of society as an opportunity, and to emphasise a disposition towards creativity rather than drudgery. Technology may be used to free up our lives for leisure and creativity, or for drudgery and work. Andrew Feenberg (1991) described the central contradiction between the potentialities for control and for democratization inherent in technology, and pivotal to speculation over the role of technology have been the dystopian and utopian visions of the future linked largely to these polarities.

Do we face a future in which a few use technology to control and pacify the many, in which humanity at large is debased by a digital divide in which they are passed over, or will technology free the majority from lives of drudgery to unlock their full potential? This is very much a political choice, and depends upon our social organization. The sharp debates around the politics of Austerity reveal the contours of the choices we have to make within the new world order, whether to accept Neo-Liberal Taylorist arguments about the nature of Capital, or to seek Socialist alternatives and a more equal distribution of wealth. The stark choice is whether we valorize capital above labour, or see technology as an opportunity to unlock human potential in ways not previously imaginable.

We urgently need a disposition towards creativity over drudgery in the ways in which we approach our use of technology.

Jane McGonigal (2011) has written about what she calls the four superpowers of games, and how these powers may make a better world. These superpowers form perhaps the dispositions of successful gamers. McGonigal argues they are the crucial dispositions for life itself. They are:

  1. Urgent Optimism
  2. Blissful productivity
  3. Social Fabric
  4. Epic Meaning

Her characterization of gaming, as empowering in gamers a hopeful expectation of success, combined with productive work within a social network to accomplish meaningful tasks, expresses a disposition towards optimistic, humanistic outcomes. This resonates with a view of human purpose vastly at odds with the Malthusian vision of humanity as mere cogs in a machine.

Successful thinkers display dispositions towards creativity and unlocking their human potential. This requires constantly finding new ways to express yourself through technology, rather than allowing it to be used to marginalise your humanity. Thinking Digitally means that you are alive to the human within the machine.

Conclusion

In this exploratory paper I have attempted to argue that the most crucial element of our emerging relationship with technology is the dispositions and habits of mind we bring to our use of machines to automate and extend our thinking. The digital divide is often conceptualised as a divide around access to devices, but in reality it is a mental divide between those whose dispositions empower human creativity and problem solving, and those whose habits of mind limit the realm of the possible.

Bibliography

Bruner, J. (1991). The Narrative Construction of Reality. Critical Inquiry, 18(1), 1–21.

Campbell, J. (1991). The Masks of God: Creative Mythology. Penguin Group USA.

Carr, N. (2011). The Shallows: What the Internet Is Doing to Our Brains. W. W. Norton & Company.

Castells, M. (2011). The Rise of the Network Society: The Information Age: Economy, Society, and Culture. John Wiley & Sons.

Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind – BBC News. Retrieved April 8, 2016, from http://www.bbc.com/news/technology-30290540

Clark, A., & Chalmers, D. J. (2010). The Extended Mind. In The Extended Mind (pp. 26–42).

Costa, A. L., & Kallick, B. (2008). Learning and Leading with Habits of Mind: 16 Essential Characteristics for Success. ASCD.

Dewey, J. (2007). Human Nature and Conduct: An Introduction to Social Psychology. Cosimo, Inc.

Dror, I. E. (2007). Cognitive Technologies and the Pragmatics of Cognition. John Benjamins Publishing.

Dror, I. E., & Harnad, S. R. (2008). Cognition Distributed: How Cognitive Technology Extends Our Minds. John Benjamins Publishing.

Feenberg, A. (1991). Critical theory of technology. Oxford University Press, USA.

Gee, J. P., & Lankshear, C. (2006). The New Work Order: critical language awareness and “fast capitalism” texts. Discourse: Studies in the Cultural Politics of Education. http://doi.org/10.1080/0159630950160102

Gibbs, S. (2015, July 9). Moore’s law wins: new chips have circuits 10,000 times thinner than hairs. Retrieved March 22, 2016, from http://www.theguardian.com/technology/2015/jul/09/moores-law-new-chips-ibm-7nm

Greenfield, S. (2014). Mind Change: How digital technologies are leaving their mark on our brains. Random House.

Grover, S., & Pea, R. (2013). Computational Thinking in K-12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43.

Harnad, S. (1991). Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge. Public-Access Computer Systems Review, 2(1), 39–53.

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin UK.

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do Learners Really Know Best? Urban Legends in Education. Educational Psychologist, 48(3), 169–183.

Lee, I. (2016). Reclaiming the Roots of CT. CSTA Voice, 12(1), 3–4.

Maker culture – Wikipedia, the free encyclopedia. (2016, March). Retrieved April 8, 2016, from https://en.wikipedia.org/wiki/Maker_culture

McGonigal, J. (2011). Reality is Broken: Why Games Make Us Better and How They Can Change the World. Random House.

Miller, G. A. (2003). The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.

Prensky, M. (2001a). Digital Natives, Digital Immigrants: A New Way to Look at Ourselves and Our Kids.

Prensky, M. (2001b). Digital Natives, Digital Immigrants Part 2: Do They Really Think Differently? On the Horizon, 9(6), 1–6.

Reading A Book or An Hour of Code? (2016, March 22). Retrieved May 10, 2016, from https://lh3.googleusercontent.com/-bQWsEynX-8A/VgGCHw2AdrI/AAAAAAABWTU/_y9VVvBQYgM/w1728-h1728/computational-thinking-white-bg.jpg

Ritchhart, R. (2002). Intellectual Character: What It Is, Why It Matters, and How to Get It. John Wiley & Sons.

Ritchhart, R., & Perkins, D. (1997). When Is Good Thinking? In D. Dai & R. Sternberg (Eds.), Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development. Mahwah: NJ: Erlbaum.

Sfard, A. (1998). On Two Metaphors for Learning and the Dangers of Choosing Just One. Educational Researcher, 27(2), 4.

Siemens, G. (2014). Connectivism: A Learning Theory for the Digital Age. Retrieved from http://er.dut.ac.za/handle/123456789/69

Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778.

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Thompson, C. (2013). Smarter Than You Think: How Technology is Changing Our Minds for the Better. HarperCollins UK.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33.

Wooley, S. (2013). Constantly Connected: The Impact of Social Media and the Advancement in Technology on the Study Abroad Experience. Elon Journal of Undergraduate Research in Communications, 4(2). Retrieved from http://www.studentpulse.com/articles/822/constantly-connected-the-impact-of-social-media-and-the-advancement-in-technology-on-the-study-abroad-experience


 
Comments Off on Is Thinking Digitally a New Habit of Mind or Simply Old Habits in a New Context?

Posted by on December 6, 2019 in Computational Thinking, Critical thinking, Habits Of Mind, Learning Theories, Pedagogy, Thinking Digitally, Thinking Skills

 

Teaching ICTs – Computational Thinking Pedagogies & Thinking Strategies

As more and more schools start teaching computing explicitly, in one form or another, to all students, the focus moves from advocacy (getting computing for all into schools) towards pedagogy: how best to teach it. A number of pedagogical approaches and cognitive strategies have been promoted, but I would like to look at just one which I have found effective over the years. I think it would be fair to say that many teachers of ICTs are self-taught, and perhaps for this reason the discovery method is still remarkably popular. The influence of Seymour Papert's Constructionism, allied with Constructivist learning principles, has made direct instruction far less prevalent in the computer class than in perhaps any other classroom. Nevertheless, a lack of direct instruction, I believe, threatens to undo many of the benefits of a discovery learning framework.

Discovery learning is predicated on setting problems which allow students to learn through grappling with possible solutions, drawing on their past learning and knowledge. But there is a danger that no learning will take place at all if the discovery process is not scaffolded, and, ironically, the greater the reliance on student-centered learning, the greater the need for directed teacher interventions. The greatest weakness of the discovery method is encapsulated in the scholar's dilemma: how do you discover something that you don't know exists? You need a more experienced other to, at the very least, nudge you in the right direction.

Teachers who use discovery learning need to be careful to ensure that students have the tools they need to learn something from problem-based approaches. This scaffolding can take many forms, but without it learning is a very hit-and-miss affair. Teachers need to adopt a range of strategies to scaffold the kind of knowledge that will feed into problem-based learning activities. For example, you can carefully guide students through the base skills and knowledge needed, and then set more open-ended projects that build on this learning. Or you can set open-ended projects and make knowledge available where needed as students explore what they need to discover to solve the problems set. Alternatively, you can pursue a mix of these approaches, giving some instruction up-front and then supplementing knowledge where it is needed.

But another approach is also possible. If students are working in groups, you can use a jigsaw technique. Students are first split into expert groups and work on guided assignments so that each becomes an expert in one aspect of the overall task. New groups are then formed by taking a member from each expert group, so that every group contains experts in the different aspects applicable to the task. Students then teach each other what needs to be known to perform the task. For example, if a project uses spreadsheets, one expert can be trained in formatting cells, another in creating formulae, another in functions, another in graphing, and so on. When they combine, they should be able to tackle a task which requires all these skill sets, and students should learn from each other.
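The regrouping step is easy to get wrong with a large class, so it can help to work it out mechanically. As a rough sketch (the skill names and class size here are invented for illustration), assigning students round-robin to expert groups and then taking one member from each produces the jigsaw:

```python
# Jigsaw grouping sketch: split a class into expert groups,
# then recombine so each jigsaw group has one member per skill.
# Skill names and the class roster are invented examples.

skills = ["formatting", "formulae", "functions", "graphing"]
students = [f"Student {i + 1}" for i in range(16)]

# Phase 1: round-robin assignment into one expert group per skill.
expert_groups = {skill: students[i::len(skills)]
                 for i, skill in enumerate(skills)}

# Phase 2: each jigsaw group takes one member from every expert group.
group_size = len(students) // len(skills)
jigsaw_groups = [
    [expert_groups[skill][n] for skill in skills]
    for n in range(group_size)
]

for n, group in enumerate(jigsaw_groups, start=1):
    print(f"Group {n}: {group}")
```

With 16 students and 4 skills this yields 4 jigsaw groups of 4, each containing one formatting, one formulae, one functions and one graphing expert.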

An alternative approach is to have groups work on sub-tasks which combine to solve a larger problem. Each group works on a task which, combined with the others, comes together to provide an elegant solution. For example, one group might work on moving a robot, another on the operation of its crane, and so on. These approaches do, however, require different students to receive different instructional input. This can be achieved using a flipped classroom model. Even when I cover material in class, I make a video of the content and post it on my Learning Management System so that it is accessible to everyone, even to a student who did not do that particular task at the time.

Having said this, managing different instruction for different students is a logistical problem. I prefer to have a wide range of tasks and either allow students to choose the tasks they wish to pursue, or to focus tasks around building capacity for particular projects. An example would be the task below. By preparing task cards which set out the brief for a task, but also include a hint about how to proceed, students can work at their own pace, or use the tasks to build up the skills needed to tackle larger projects. A task card may include very detailed steps to follow, or simply hints.

I have gamified the tasks, which is not necessary, but does, I believe, add a certain something. If you allocate Experience Points (XP) for completing tasks, you can use the number of XP earned to unlock larger projects or challenges, much in the way that XP allows characters to level up in RPGs. This ensures that students complete as many tasks as possible, hopefully accumulating the base skills needed for the challenges and projects. Because the hints and instructions are on the reverse side of the task card, as a teacher you do not need to do a great deal of whole-class instruction. Jigsaw groups, or sub-task groups, can work relatively independently, typically after a whole-class session introducing a new application. If students are still stuck, I am able to answer individual queries in class, and there are always the videos as back-up.
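The unlocking mechanic itself is simple bookkeeping, whether you track it on paper or in a spreadsheet. As a rough sketch (the task names, XP values and thresholds here are invented examples, not my actual task cards), completed tasks accumulate XP, and a project unlocks once its threshold is reached:

```python
# XP-unlock sketch: tasks award XP, and larger projects unlock at
# XP thresholds. All names and point values are invented examples.

task_xp = {"Format a worksheet": 10, "Write a formula": 15,
           "Use a function": 20, "Draw a graph": 25}

project_thresholds = {"Class survey dashboard": 40,
                      "Sports-day results system": 60}

def unlocked_projects(completed_tasks):
    """Return the projects a student has unlocked so far."""
    earned = sum(task_xp[task] for task in completed_tasks)
    return [project for project, needed in project_thresholds.items()
            if earned >= needed]

# A student with 25 XP has unlocked nothing yet; 60 XP unlocks both.
print(unlocked_projects(["Format a worksheet", "Write a formula"]))
print(unlocked_projects(["Write a formula", "Use a function", "Draw a graph"]))
```

The same arithmetic works in a class spreadsheet: one column of XP per student, summed, and compared against the project thresholds.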

 
 