

Is Thinking Digitally a New Habit of Mind or Simply Old Habits in a New Context?


What makes for good thinking? It is commonly assumed that our intelligence or abilities have the greatest effect on our thinking. Ron Ritchhart (2002), however, makes the case that it is our dispositions, our acquired patterns of behaviour, which activate and direct our abilities. More than our abilities themselves, our dispositions afford our capacity for greatness. These dispositions are shaped and formed by our interactions with others, by our culture, society and history (Ritchhart & Perkins, 1997). Dispositions, or habits of mind (Dewey, 2007), are pivotal because they determine when we choose to think, the importance we place on it, and how deeply we question the assumptions and narratives (Bruner, 1991) upon which thought is based, allowing new narratives to emerge.

For example, the habit of mind persisting suggests a disposition towards keeping at it: trying new ways to approach a problem, not being daunted by failure, worrying away at something until you succeed. This disposition applies in many different contexts, including digital spheres, such as computer programming. It is clearly a disposition which affords success. You cannot succeed if you give up too easily!

Likewise, the capacity for finding humour affords success in many different contexts. Costa & Kallick (2008) have identified sixteen such habits of successful thinkers. These habits do not describe the knowledge or skills necessary for success, but the dispositions which enable successful thought and action in the world.

In this article I will be arguing that Thinking Digitally is just such a disposition: a habit of mind which predisposes those who demonstrate it towards success in the same way that persistence or finding humour does. It is a disposition towards using digital tools to aid thought and action in the world; towards deploying algorithmic thinking as a way of generating explanations of reality; and towards using computational thinking as an approach to problem solving. These ideas will be unpacked below, but it is important to note that they share a concern with the disposition towards using digital devices to create, solve problems and understand the world we live in. I have chosen the term Thinking Digitally rather than the more commonly used computational thinking because I believe it describes a wider range of uses of digital devices: not only how we organise tasks that are going to be automated by machines, mathematical modelling, and the use of simulations, simulators, calculators and computers, but also the ways in which we access and evaluate information, author digital texts and communicate our understandings in the current era.

I will argue that Thinking Digitally is a logically separate habit or disposition rather than the application of any combination of the other sixteen habits in a new context, the digital world. Marc Prensky (2001b) has advanced the idea that people born after a certain date, whom he has dubbed digital natives, carry a natural disposition towards digital technologies. He argues that people born before the advent of the digital era are digital immigrants; that digital technologies are somehow foreign to them, and that while digital natives are able to use digital devices effortlessly, immigrants have no such natural facility. In other words, digital dispositions are based on age. While this idea has gained widespread traction because it seems to explain why some people take to digital technologies like a duck to water while others struggle with them or avoid them entirely, it has been roundly critiqued, as we shall see below. I shall argue instead that the notion of a set of digital dispositions based on certain characteristics or habits of mind, rather than on age, carries more weight.

In recent years, teachers within the Computer Science community have begun to argue that Computational Thinking, often called Algorithmic Thinking, should be integrated into the curriculum, not just as a separate discipline but as a foundational subject. I will argue that Computational and Algorithmic Thinking form a new mode of thought, different from the paradigmatic or narrative modes, and that the predisposition towards using this mode of thought forms the basis for arguing in favour of a separate habit of mind. I will argue that a disposition towards using computational and algorithmic thought, allied to using digital media to access and legitimate knowledge, forms part of an evolving revolution in thought which is transforming the world, and towards which some people appear to be more disposed and others less so. Acquiring the habit of digital thinking clearly advantages some, and leaves others floundering in an increasingly digital world.

If we look at recent ideas around epistemology, theories about learning and cognition, pedagogy and literacy, we can begin to discern polar opposites which, while they do not define neat categorizations of analogue and digital thought, do point to ways in which certain polarities are afforded by either analogue or digital modalities. We are living in an Information Age (Castells, 2011), in which the digital storage and processing of information is increasingly central to our knowledge and economic survival, and this privileges certain ways of looking at and thinking about the world. Our ability to succeed in this altered landscape is largely dependent upon our dispositions and habits of mind, not just on our knowledge or skills. Some of these dispositions have already been described by Costa and Kallick, but some are unique to digital contexts, and revolve around how we think about our relationship with digital cognitive technologies.

Kahneman (2011) has argued that there are two types of thinking in operation. System I thought is fast, intuitive thinking in which we come to rapid assessments based on our accumulated habits of thought and the received judgements handed down to us. System II thought, on the other hand, is slow thinking in which we carefully reason through a position, considering the evidence and reaching a judgement based on a logical process. We need to rely on System I thought most of the time; we cannot tell life to wait while we carefully reason over a position. System II thought is, however, vital because it allows us to build better instincts and base our intuitive responses on more solid ground. In other words, it frames our habits and dispositions. Successful critical thinking is based on building solid habits of mind on which we can depend when we don’t have the time to carefully consider our responses.

In much the same way, I want to make the case that the salient feature of our use of digital technologies lies not in the skills and abilities we bring to any given task, but in the dispositions and habits of mind that unlock what we are able to accomplish with technology. Behind digital literacy, in other words, lies a set of mental habits and dispositions which allow the power of the technology to be unleashed. My capacity to use a database is one thing, but it is my digital dispositions which determine when I choose to use a database, for what purposes, and to what effect. It is their dispositions that allow some people to use technology to unlock their creativity and solve problems rapidly and effectively, and that constrain others to use it merely for repetitive or largely meaningless tasks, or not to use it at all.

What are the dispositions and habits of mind of successful people when they are using digital devices and media? Do these dispositions form a logically discrete entity or are digital technologies simply a different context in which the habits and dispositions (Costa & Kallick, 2008) that apply to analogue thinking are applied?

In working through this argument I am going to make a number of moves which are intended to present the case that there are indeed a set of uniquely digital dispositions, and that it makes sense to speak about thinking digitally as an overarching approach that successful thinkers bring to their use of technology, which shapes and guides effective digital practice. Indeed I will argue that these digital dispositions may also be brought to analogue contexts. In other words, we human beings, in learning how to think effectively when using digital tools, may have acquired some dispositions which we can carry back into the real, analogue world.

Computational Thinking

[Figure: a breakdown of the elements of computational thinking]

The first move is to argue that one set of digital dispositions hinges on the thinking that we do when we interface with machines. A frequently used term is Computational Thinking (Wing, 2006), which describes the “thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information processing agent” (Lee, 2016). It is essentially taken to mean, then, the thought processes of one who is using a machine to solve a problem, and it is normally constituted as resting on three cognitive pillars: abstraction, automation and analysis. When we use cognitive technologies to automate or perform tasks we need to program the machine: we break the task down into sub-routines that can be automated, and we think the task through from the point of view of what the machine can and cannot do. The illustration above (“Reading A Book or An Hour of Code?,” 2016) is typical of attempts to break down what constitutes computational thought.
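To make these pillars a little more concrete, here is a minimal sketch in Python. The file names, folder and helper functions are my own invention purely for illustration; the point is only to show a mundane task being decomposed into sub-routines, expressed abstractly so that irrelevant detail is ignored, and then handed to the machine to repeat tirelessly:

```python
from pathlib import Path

def read_readings(path):
    """Decomposition: one sub-routine that parses a single log file into
    a list of numbers, ignoring blank or malformed lines."""
    readings = []
    for line in Path(path).read_text().splitlines():
        try:
            readings.append(float(line.strip()))
        except ValueError:
            continue  # abstraction: we only care whether a line is a number
    return readings

def summarise(readings):
    """Another sub-routine: reduce many readings to the few figures we care about."""
    return {"count": len(readings),
            "mean": sum(readings) / len(readings) if readings else None}

# Automation: the machine applies the same procedure to every file, however many there are.
for log_file in sorted(Path("logs").glob("*.txt")):  # hypothetical folder of temperature logs
    print(log_file.name, summarise(read_readings(log_file)))
```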

Proponents of Computational Thinking in the curriculum argue that “algorithmic thinking” should form the fourth R of the curriculum: Reading, ‘Riting, ‘Rithmetic & ‘Rithms (Grover & Pea, 2013). The two terms appear to be used interchangeably. What, then, is Algorithmic Thinking?

Algorithmic Thinking vs Narrative & Paradigmatic Thought

[Figure: the paradigmatic and narrative modes of thought]

Bruner (1991) sets out a dichotomy between two modalities of thought: the narrative and the paradigmatic. The paradigmatic mode, the scientific mode, seeks to logically categorize the world and test hypotheses of what is. The narrative mode, on the other hand, is concerned with the meaning that is ascribed to lived experience. When explaining the world, why an apple falls to the ground, for example, one could use paradigmatic thinking, appealing to a concept, the theory of gravity. But one could also use the narrative mode to explain why an apple falls to the ground: the apple falls because I knocked it with my elbow, and it rolled off the table and fell to the ground. There is nothing necessarily superior in either explanation. Both have explanatory power.

The advent of complex calculating machines, however, has furnished us with new ways of constructing explanations of the world. Computer modelling offers ways of constructing a model of how tables, elbows and apples might behave under different sets of circumstances. Inputting slightly different sets of data into a computer and observing the outcomes can generate a more complete understanding of how apples roll and fall, and allow us to make more accurate predictions. The sheer number of calculations necessary to sustain this mode of thought was largely impossible before the advent of computers, but it now allows us to predict the weather and find distant planets.
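As a minimal illustration of this way of generating understanding, the sketch below sweeps a toy model of the rolling apple across slightly different inputs and simply observes what the outcomes have in common. The physical model and the parameter values are simplified assumptions of my own, not a serious mechanics simulation:

```python
import math

def time_to_hit_floor(table_height_m, roll_speed_ms, g=9.81):
    """Toy model: an apple rolls off a table and falls freely.
    Returns (fall time in seconds, horizontal distance travelled in metres)."""
    t = math.sqrt(2 * table_height_m / g)
    return t, roll_speed_ms * t

# Algorithmic exploration: vary the inputs slightly and look for the pattern in the outcomes.
for height in (0.7, 0.75, 0.8):
    for speed in (0.1, 0.3, 0.5):
        t, d = time_to_hit_floor(height, speed)
        print(f"height={height} m, speed={speed} m/s -> lands {d:.2f} m away after {t:.2f} s")
```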

[Figure: the paradigmatic, narrative and algorithmic modes of thought]

Those who have advanced the idea of an algorithmic mode of thought would be hard pressed to say where to place it in Bruner’s schema. Algorithmic thinking is not necessarily about categorizing the world or logically predicting outcomes, nor is it about narrating an explanation, although it may involve both of these things. It shares with paradigmatic thought a concern with establishing procedures, but shares with narrative thought a concern with sequencing. It should probably then be seen as a separate mode of thought. In the diagram, I have represented this by placing it off to one side.

But, as we noted, there are considerable overlaps, so it would probably be better to represent this schema as a series of overlapping circles. This has the advantage of allowing for greater nuance. Bruner himself, for example, shows how narrative thought lies behind much of scientific explanation.

[Figure: the three modes of thought represented as overlapping circles]

Behind the algorithmic mode of thought lie the mathematical modelling of the world and the methodology of simulation. Mathematical modelling, enabled by the power of computers, represents powerful new ways of acquiring knowledge of the world. Evolutionary engineering, for example, uses comparisons between candidate designs to reach more efficient designs in the fewest possible steps. Rather than designing complex solutions from first principles (the paradigmatic view), one compares one design with another, selects the better of the two, and repeats the process; near-optimal designs can be reached in relatively short periods of time.
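A minimal sketch of that compare-select-repeat loop might look like the toy hill climber below. The objective function, mutation step and target values are contrived assumptions of my own, standing in for the simulation results a real evolutionary design tool would use:

```python
import random

def fitness(design):
    """Toy objective: how close a (width, height) design gets to a target area of 12
    while keeping its perimeter small. Real evolutionary engineering would score
    simulation results (drag, stress, cost) here instead."""
    width, height = design
    area_error = abs(width * height - 12)
    perimeter = 2 * (width + height)
    return -(area_error + 0.1 * perimeter)  # higher is better

def mutate(design, step=0.2):
    """Produce a slightly different candidate design."""
    return tuple(max(0.1, dim + random.uniform(-step, step)) for dim in design)

best = (1.0, 1.0)
for _ in range(2000):
    challenger = mutate(best)
    if fitness(challenger) > fitness(best):  # compare two designs, keep the better one
        best = challenger

print(f"best design found: {best[0]:.2f} x {best[1]:.2f}, fitness {fitness(best):.3f}")
```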

Cognitive Offloading & Cognitive Technologies

Because we have such small working memories (Sweller, 1988) we constantly need to offload our cognition onto the environment. We use language and writing to achieve most of this, but we also use our fingers when counting, or devices such as calculators, calendars, books, and increasingly these days digital devices such as cell phones, computers or the Internet. Dror (2007) argues that digital tools have become so pervasive and integral to our thinking processes that they cease to be external aids and become internal components in our cognitive processes. Cognitive technologies, in other words, have become instrumental in shaping our cognitive processes. New forms of distributed cognition (Clark & Chalmers, 2010) have arisen, in which our thinking is distributed across machines and across other people (Dror & Harnad, 2008), radically altering our cognitive landscapes.

These new theories about cognition and knowledge, encompassed within what is known as the Second Cognitive Revolution in psychology (Miller, 2003) clearly represent radical new ways of looking at what it means to be human, and how we see the relationship between man and machine.

Computational Thinking & Digital Thinking

Jeannette Wing, who popularised the term Computational Thinking, described it as “a universally applicable attitude and skill set everyone, not just computer scientists, would be eager to learn and use” (Wing, 2006). Nevertheless, despite a growing receptivity to the idea that Computer Science should be considered a core subject in the curriculum (Grover & Pea, 2013), and despite the growing integration of ICTs in our lives and educational systems, Computational Thinking is still seen largely as part of the Computer Science curriculum rather than as a mode, or disposition, of thought which pervades all subject disciplines. Lee argues that there is a tendency to conflate Computational Thinking with the operational steps involved, with computer literacy, or with information literacy (Lee, 2016). This misses the point that Computational Thinking is in essence the disposition of thought that human beings use when they are using machines to automate thinking processes, or offloading cognition onto machines.

I think part of the difficulty lies in the use of the term computational. As Wing stresses, the aim of Computational Thinking is not to get human beings to think like computers, but to describe how human beings think when they are using machines to solve problems. The term computational is so firmly rooted in the mathematical and engineering sciences that it tends to preclude the use of computers for perhaps their most vital task, that of communicating and accessing knowledge. How knowledge is legitimated and disseminated lies at the centre of our cognitive identities, and the new digital technologies are in the process of transforming the rules of the game. I believe that it is more useful to speak of how human beings organise their thought processes around problem solving using machines as digital thinking rather than computational thinking. This semantic change allows us to conceptualise computers as communication as well as computational devices and to see the dispositions of thought as including both purposes.

Digital technologies have transformed our practices in any number of ways, but it is important to note that every shift in practice represents a mental shift as well. It changes how we approach problem solving in subtle but important ways.

Digital Thinking vs Analogue Thinking

The second move is to claim that human thought is undergoing something of a revolution, and that particular aspects of our cognition are being accentuated and transformed by the rapid advances in digital media and what they afford. These changes are sufficiently momentous, and the contours of such change sufficiently discernible, that we can begin to talk about a difference between digital and analogue thinking. These differences are not absolute; we have not acquired new brains overnight, but they are polarities which are impacting on our cognitive lives. I believe that those aspects of our thought which are afforded by digital technologies have been growing in importance, in line with the development of the information economy and the new world order, which has altered our discursive landscape (Gee & Lankshear, 2006) and may well be altering our brains as well. Neuroplasticity ensures that changes in practice have an impact on our brains (Prensky, 2001b) and, it must be assumed, on our cognition as well.

[Figure: analogue and digital thinking as polarities cutting across the modes of thought]

That we are experiencing change is not in contention, but how to characterise that change is. There are perhaps two possible views on this. The first, what we might term the hard view, is that Digital Thinking represents a new mode of thought afforded by the new digital media. This modality of thought has sufficient points of difference from Analogue Thinking to warrant identifying a range of habits and dispositions of mind which signal successful practice. This view sees cognition itself as being revolutionised. Stevan Harnad (1991), for example, talks of a Fourth Cognitive Revolution, as described below, and this has obvious ramifications for our behaviour and habits. The soft view, on the other hand, sees cognition as being largely unaltered by digital media, but holds that new approaches and dispositions are needed for dealing with the differences inherent in using digital media. The illustration above attempts to characterise the two polarities of analogue and digital thinking as cutting across the different modalities of thought discussed previously (paradigmatic, narrative and algorithmic thought).

I do not believe that we can yet adjudicate between these views, but whether cognition itself is changing, or only our habits of mind, does not essentially alter the schema which can be discerned emerging from the literature around digital literacies, cognitive psychology and computational thinking. In either event it makes sense to map out the contours of how we perceive the terrain as altered by the introduction of digital technologies.

In an always-on, always-connected world, as more and more devices become connected in the Internet of Things, and as connectivity becomes more commonplace and more integral to our lives, many are asking questions about what this means for our cognitive development. Headlines scream that the Internet is making us dumb (Carr, 2011; Thompson, 2013) and neuroscientists suggest that our brains are being re-wired (Siemens, 2014; Greenfield, 2014) for good or ill. Psychologists suggest that it is affecting our memories (Sparrow, Liu, & Wegner, 2011) or our social lives (Wooley, 2013). There is a sense in which we stand at a crossroads, uncertain of both the direction and the meaning of the rapid changes which are shaping our society. Utopian dreams and dystopian fears permeate the debate around the extent to which technology is driving change. Advances in Artificial Intelligence raise the prospect that we will soon not be the only sentient beings on the planet, ushering in a Brave New World where the difference between human and machine may not be so clear cut.

I would argue that never before have we needed a roadmap for navigating the future as much as we do now. Moore’s Law (Gibbs, 2015), which observes that the number of transistors that can be fitted onto a chip doubles roughly every two years, is strictly a claim about hardware, but it has become a metaphor for the rapidity of change and expresses the sense in which the rate of change is exponential. I would argue that what is required is a new set of habits and dispositions: strategies for dealing with this change, and with the changes associated with the Internet revolution. Our old ways of thinking, the habits and dispositions of centuries, may not be enough to guide us in the future.
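To put rough numbers on that exponential: a quantity that doubles every two years grows roughly 32-fold (2^5) in a decade and more than a thousandfold (2^10) in twenty years; at the often-quoted 18-month doubling period, the ten-year figure is closer to a hundredfold.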

The advent of the printing press saw the introduction of a medium of mass communication which has had far-reaching implications for our history, society and thought. When books were available only to a select few, and knowledge depended upon individual transmission, knowledge was hierarchical and legitimated by an appeal to authority alone. In the twelfth century Peter Abelard initiated a revolution in knowledge (Campbell, 1991), an appeal to evidentiary support in challenge to authority, in which even theology had to bow before reason, but it was only with the invention of the printing press, and the mass production of books made possible by the Industrial Revolution, that knowledge could be distributed more widely, ushering in the ideological sea change represented by the Enlightenment. How knowledge is legitimated, disseminated and reproduced changes completely how individuals think, learn, create and solve problems.

I would argue that we are at a similar turning point, represented by the new digital media. The ability to self-publish, afforded by the Internet, has changed how knowledge is disseminated, altering legitimation practices. Many academics publish material on their blogs, and the speed of publication ensures a form of peer review, though not in the conventional sense. An example of the future is Wikipedia, the repository of crowdsourced knowledge said to be as accurate as the Encyclopaedia Britannica. Stevan Harnad’s notion of academic “sky-writing” encapsulates the idea of the Internet combining the immediacy of oracy with the reflective power of literacy to bring about the Fourth Cognitive Revolution. Harnad’s (1991) schema is tabulated below.

Revolution | Medium | Approximate period | Cognitive affordance | Possible neural correlate
First Cognitive Revolution | Oracy | Hundreds of thousands of years ago | Communication is immediate and interactive | Growth of Wernicke’s and Broca’s areas in the brain?
Second Cognitive Revolution | Literacy | Tens of thousands of years ago | Allows for reflection between communications | Growth of Exner’s area in the brain?
Third Cognitive Revolution | Printing press | c. 1450 AD | Revolution of scale |
Fourth Cognitive Revolution | Internet | c. 1990s AD | Near-synchronous communication: brings together immediacy and reflection | Are our brains being rewired?

This periodisation carries with it a sense of the cognitive affordance offered by the new digital media: near-synchronous communication. Ideas can be disseminated rapidly, enabling quick responses and affording collaborative thinking, but with enough time for some reflection before responses are made. The last column, linking the revolutions to possible neural developments, is highly speculative, and Harnad himself makes only tentative suggestions in this regard, but it feeds into a contemporary literature suggesting that the move from page to screen carries with it some neural rewiring.

News and opinions are shared rapidly across social media, replacing traditional news sources. In a world where breaking news is not mediated by the professional journalistic eye, but blogged, tweeted and facebooked by participants, we need to reassess how we access knowledge about the world. Content is similarly mediated by new mechanisms. In a world where opinions are discussed through the vehicle of the share, the like and the Internet meme, we need to see knowledge as a commodity that is rapidly traded across the globe in relatively unmediated ways.

I think we can begin to discern the basis for an argument that we are seeing a revolution in cognition, a shift away from analogue, towards digital modes of communication and cognition.

Digital media do not replace analogue channels; as a species we still rely on our senses. Nevertheless, digital media, embodied in the Internet and other telecommunication technologies, represent new modes of communication which have far-reaching consequences for how we communicate, think, learn, create and solve problems.

How our thinking is mediated, and the types of cognitive offloading that are afforded by these technologies may be represented by the following chart which compares Digital and Analogue Contexts:

Thinking Digitally | Analogue Thinking | Discussion
Hyperspace | The five senses | Push technologies in particular afford the delivery of information from sources not limited to our biological five senses. For example, I learn what the weather is like outside from my cell phone rather than by looking out the window.
Fast paced | Slow paced | Acquiring data in an analogue world is slow and deliberate; gathering data in a digital world is almost immediate and less considered. Speed has advantages and disadvantages and calls for different approaches to evaluating data. I can use a search engine to find out about something rather than going to a public library and finding a book on the subject.
Multi-channel (digital media streams) | Single channel (biological senses) | While data in the analogue world comes at us via the multiple channels of our senses, the digital world opens up even more channels, some of them referencing analogue channels, others entirely in cyberspace. This has advantages and disadvantages and calls for different approaches towards data collection. As I type this, I am receiving data from my cell phone, emails and alerts are appearing in my inbox, and I am able to use the Internet to search for information, all on top of what I sense about the world around me through my biological senses, which tend towards a single channel.
Parallel Processing | Serial Processing | The speed at which different channels of communication can be accessed affords multi-tasking. Given the severe limitations imposed on our working memory we can never truly multi-task, and we can certainly take on simultaneous tasks in the analogue world too, but the plethora of extra channels extends our ability to work on many levels at the same time. I tend to take on more tasks in parallel while using digital media, partly because it is more efficient to do so, and partly because my mind is parcelling out sub-tasks which are asynchronous communications: while I wait for a reply to a query I have posted on a forum, I write an email and watch a kitten video on YouTube.
Hypertextual Screen Reading | Linear/Narrative Page Reading | Readings in hyperspace are hypertextual rather than linear or narrative. Hypertextual reading affords rapid synthesis and evaluation; linear, narrative reading affords a closely argued train of thought. Using a website I can rapidly skim for relevant information by following links and using the Find or search function; if I want to follow a train of thought I have to read sequentially.
Fast and Wide Thought | Slow and Deep Thought | Multi-tasking and hypertextual reading afford the rapid synthesis and evaluation of ideas and the identification of patterns: Fast Thought. Linear and narrative reading affords closely reasoned consideration of a train of thought. Most decisions we make are the result of Fast Thought, and digital media support it by making more information available and allowing us to find patterns more actively. Page reading, however, affords deep thought because I have to follow the sequence of words in the order they were written.
The Reader Creates The Text | The Writer Creates The Text | Hypertext is epistemologically predisposed towards a constructivist paradigm in which readers construct their own meaning from the world around them; the text is created by the reader, following links. Printed text is predicated on an author communicating and transferring a message, and is epistemologically neutral; the text is created by the writer.
Constructivist Pedagogical Paradigm | Instructivist Pedagogical Paradigm | Theories about how we think and learn have changed considerably over the last hundred years or so, with a shift towards theories stressing how meaning is constructed rather than perceived.
Learning by participation | Learning by acquisition | Sfard (1998) has argued that there are two metaphors for learning, both of them valid: learning as participation and learning as acquisition.
Symbolic Manipulation | Symbolic Creation | Fast thought affords rapid symbol manipulation, while slow thought affords the creation of new symbols. Digital media allow us to rapidly manipulate information in many forms, but page reading allows us to generate new concepts.
Computational (Digital/Algorithmic) Thinking | Narrative & Metaphoric Thinking | Jerome Bruner has shown the importance of narrative in our cognition: if we think about an issue we tend to do so by relating it, and we also think metaphorically, comparing things to something similar in our experience. Computational thinking is something different; it operates at the level of simulation and modelling. Digital media allow us to create simulations by modelling a problem and playing it out, rather than thinking through a solution by relating its contours, and this often produces unexpected and counterintuitive results.

This schema is presented as a set of polarities which must inevitably form a continuum, and I do not believe that they form a rigid dichotomy. It would be far more accurate to say that there is a dialectical relationship between the two.

The Foundations Of Digital Thinking

This characterization of twin polarities, operating in a dialectical (dialogic) relationship rather than in opposition, allows us to argue that there are several elements in human thought which, if not new, are at least being emphasised and foregrounded by the growing centrality of computers and ICTs in our lives. If we accept that even more important than our abilities and capacities are our dispositions, our ingrained behaviours and responses to the problems and challenges we face, it becomes clear that how we react to this shift in technology from analogue to digital is crucial in what we can accomplish.

What elements then constitute the dispositions of successful users of computational and digital media? If we look at successful use of digital media I think we can start to identify a number of elements.

Embracing Change

We all know that some people seem to embrace change and use it to their advantage, while others appear fazed by change and shy away from it, or appear less imaginative and productive in the ways in which they use it. This has been falsely framed as a generational divide (Prensky, 2001a): that those born after 1985 (digital natives) are somehow wired differently, and have a natural affinity with ICTs which older people (digital immigrants) lack. While our brains may well be wired differently, if I am following this debate correctly, neural plasticity implies that all users of new technologies, regardless of age, will experience changes. There is a great deal of research indicating that this notion of a differently wired, differently thinking generation is a myth (Kirschner & van Merriënboer, 2013). There is no evidence to suggest, for example, that a new generation of youngsters is somehow predisposed to self-directed, self-paced, collaborative learning because their brains are wired this way.

The divide is really one of disposition. Some youngsters are “immigrants” as much as some older people are, and some older people are completely at home with technology. Exposure to, and comfort with, a technology does not in itself imply a capacity to use the technology in the most beneficial ways. For example, technology may afford self-directed learning, but it is one’s dispositions which determine whether or not these opportunities will be taken up. These dispositions generally need to be taught, and herein lies the central paradox of education. Discovery learning, as attractive as it sounds, is ultimately a deeply flawed idea because of the scholar’s dilemma: how can you discover something until you know it is there to be discovered?

For example, our brains, constrained by cognitive load, limit our ability to multi-task (Kirschner & van Merriënboer, 2013; Sweller, 1988). Our dispositions and habits, however, may allow us to handle multiple tasks with greater facility, overcoming the limitations of our biology to some extent.

Successful thinkers tend to embrace change rather than balk at it. Thinking Digitally means that you are constantly ready to adapt to change, and see it as an opportunity rather than a constraint.

Production vs Consumption

They say a picture is worth a thousand words. There is all the difference in the world between creating and editing pictures and viewing them. It is not, however, that the one is good and the other bad. There are skills and sensibilities, dispositions and capacities which are involved in both activities. To read a picture effectively requires a thorough understanding of the genres and contexts in which the picture was taken. Producing a picture also requires skills and abilities, understandings and literacies. I would argue then that we cannot privilege authorship above criticism. Both have value and a place. We are all at some stage or another a producer or a consumer of both analogue and digital technologies and products.

Nevertheless, there are good habits and dispositions around production, and likewise for consumption. These dispositions are likely to be very similar in analogue and digital contexts. Having said this, however, we need to highlight one vital difference.

All technologies require knowledge and skills both around production and consumption. But some technologies are clearly more complex than others and require specialised skills and knowledge. While almost everybody learns to read and write, very few learn to write software code. While most people can use a screwdriver, relatively fewer can use a soldering iron. This complexity factor limits and constrains the use of all technology. But digital technology represents a level of complexity which makes this division between people who can and people who can’t more stark. It is relatively rare to hear someone say of any analogue technology that it is simply beyond them and they cannot use it in any way. And yet this attitude towards digital technology is frequently heard.

A person who is relatively handy can tinker with, and fix a range of analogue technologies without ever being an expert in any of them. And yet, increasingly this tinkering is not possible with digital technologies as the technology requires such specialised skills and knowledge that the ordinary “handyman” cannot fix a problem.

I am no motor mechanic, but even I have been able to get a Volkswagen Beetle motor running again with a little logic, a prayer, and a spanner. There is no way that I could do the same with a modern car, with its electronic complexity and sealed units. The growing complexity of our machines threatens to turn us all into helpless consumers, and robs us of the ability to tweak and tinker which is an essential part of being an effective producer.

Personal computers were first introduced very much within a tinkering culture; the first computers were shipped with no applications – users had to create their own programs. These days the reverse is true – computers and devices are loaded with a wealth of applications, and programming is both unnecessary and discouraged. Production within each application is further consumerized by a move towards authorship using templates and wizards. Computing has moved from a strong producer ethos to one which encourages consumption. The rapid growth in computer usage explains this shift, but I would argue that as the user base expands so does the need to actively encourage producer dispositions, which are being lost by the ease of consumption.

Growing social complexity and knowledge specialization means that increasingly we are reliant on others to produce and fix, what formerly we could tinker with ourselves. This leaves us often helpless. While knowledge and skills specialization accounts for some of this movement from producer to consumer cultures, much of it is driven by underlying habits and dispositions. We can see in the Maker Movement (“Maker culture – Wikipedia, the free encyclopedia,” 2016) and in the thinking behind Computational Thinking, a growing reaction to this trend. As analogue technologies become increasingly digitized, I would argue that the need to foster producer mentalities and dispositions should be seen as a core aspect of Digital Thinking.

Successful thinkers display dispositions towards producer mentalities. Thinking Digitally means that you are able to Hack Your Life.

Creativity vs Drudgery

We face a future in which Artificial Intelligence will transform all aspects of our economy and society (Cellan-Jones, 2014). While we do not know what the future holds in store, I would argue that we need to see the growing technologization of society as an opportunity, and to emphasise a disposition towards creativity rather than drudgery. Technology may be used to free up our lives for leisure and creativity, or for drudgery and work. Andrew Feenberg (1991) described the central contradictions between the potentialities for control and for democratization inherent in technology, and pivotal to speculation over the role of technology have been the dystopian and utopian visions of the future linked largely to these polarities.

Do we face a future in which a few use technology to control and pacify the many, in which humanity at large is debased by a digital divide in which they are passed over, or will technology free the majority from lives of drudgery to unlock their full potential? This is very much a political choice, and depends upon our social organization. The sharp debates around the politics of Austerity reveal the contours of the choices we have to make within the new world order, whether to accept Neo-Liberal Taylorist arguments about the nature of Capital, or to seek Socialist alternatives and a more equal distribution of wealth. The stark choice is whether we valorize capital above labour, or see technology as an opportunity to unlock human potential in ways not previously imaginable.

We urgently need a disposition towards creativity over drudgery in the ways in which we approach our use of technology.

Jane McGonigal (2011) has written about what she calls the four superpowers of games, and how these powers may make a better world. These superpowers form perhaps the dispositions of successful gamers. McGonigal argues they are the crucial dispositions for life itself. They are:

  1. Urgent Optimism
  2. Blissful productivity
  3. Social Fabric
  4. Epic Meaning

Her characterization of gaming, as empowering in gamers a hopeful expectation of success, harnessed to productive work within a social network to accomplish meaningful tasks, expresses a disposition towards optimistic, humanistic outcomes. It resonates with a view of human purpose vastly at odds with the Malthusian vision of humanity as mere cogs in a machine.

Successful thinkers display dispositions towards creativity and unlocking their human potential. This requires constantly finding new ways to express yourself through technology, rather than allowing it to be used to marginalise your humanity. Thinking Digitally means that you are alive to the human within the machine.

Conclusion

In this exploratory paper I have attempted to argue that the most crucial element of our emerging relationship with technology is the dispositions and habits of mind we bring to our use of machines to automate and extend our thinking. The digital divide is often conceptualised as a divide around access to devices, but in reality it is a mental divide between those whose dispositions empower human creativity and problem solving, and those whose habits of mind limit the realm of the possible.

Bibliography

Bruner, J. (1991). The Narrative Construction of Reality. Critical Inquiry, 18(1), 1–21.

Campbell, J. (1991). The Masks of God: Creative Mythology. Penguin Group USA.

Carr, N. (2011). The Shallows: What the Internet Is Doing to Our Brains. W. W. Norton & Company.

Castells, M. (2011). The Rise of the Network Society: The Information Age: Economy, Society, and Culture. John Wiley & Sons.

Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind – BBC News. Retrieved April 8, 2016, from http://www.bbc.com/news/technology-30290540

Clark, A., & Chalmers, D. J. (2010). The Extended Mind. In The Extended Mind (pp. 26–42).

Costa, A. L., & Kallick, B. (2008). Learning and Leading with Habits of Mind: 16 Essential Characteristics for Success. ASCD.

Dewey, J. (2007). Human Nature and Conduct: An Introduction to Social Psychology. Cosimo, Inc.

Dror, I. E. (2007). Cognitive Technologies and the Pragmatics of Cognition. John Benjamins Publishing.

Dror, I. E., & Harnad, S. R. (2008). Cognition Distributed: How Cognitive Technology Extends Our Minds. John Benjamins Publishing.

Feenberg, A. (1991). Critical theory of technology. Oxford University Press, USA.

Gee, J. P., & Lankshear, C. (2006). The New Work Order: critical language awareness and “fast capitalism” texts. Discourse: Studies in the Cultural Politics of Education. http://doi.org/10.1080/0159630950160102

Gibbs, S. (2015, July 9). Moore’s law wins: new chips have circuits 10,000 times thinner than hairs. Retrieved March 22, 2016, from http://www.theguardian.com/technology/2015/jul/09/moores-law-new-chips-ibm-7nm

Greenfield, S. (2014). Mind Change: How digital technologies are leaving their mark on our brains. Random House.

Grover, S., & Pea, R. (2013). Computational Thinking in K-12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43.

Harnad, S. (1991). Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge. Public-Access Computer Systems Review, 2(1), 39–53.

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin UK.

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do Learners Really Know Best? Urban Legends in Education. Educational Psychologist, 48(3), 169–183.

Lee, I. (2016). Reclaiming the Roots of CT. CSTA Voice, 12(1), 3–4.

Maker culture – Wikipedia, the free encyclopedia. (2016, March). Retrieved April 8, 2016, from https://en.wikipedia.org/wiki/Maker_culture

McGonigal, J. (2011). Reality is Broken: Why Games Make Us Better and How They Can Change the World. Random House.

Miller, G. A. (2003). The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.

Prensky, M. (2001a). Digital Natives, Digital Immigrants: A New Way to Look at Ourselves and Our Kids.

Prensky, M. (2001b). Digital Natives, Digital Immigrants Part 2: Do They Really Think Differently? On the Horizon, 9(6), 1–6.

Reading A Book or An Hour of Code? (2016, March 22). Retrieved May 10, 2016, from https://lh3.googleusercontent.com/-bQWsEynX-8A/VgGCHw2AdrI/AAAAAAABWTU/_y9VVvBQYgM/w1728-h1728/computational-thinking-white-bg.jpg

Ritchhart, R. (2002). Intellectual Character: What It Is, Why It Matters, and How to Get It. John Wiley & Sons.

Ritchhart, R., & Perkins, D. (1997). When Is Good Thinking? In D. Dai & R. Sternberg (Eds.), Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development. Mahwah, NJ: Erlbaum.

Sfard, A. (1998). On Two Metaphors for Learning and the Dangers of Choosing Just One. Educational Researcher, 27(2), 4.

Siemens, G. (2014). Connectivism: A Learning Theory for the Digital Age. Retrieved from http://er.dut.ac.za/handle/123456789/69

Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778.

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Thompson, C. (2013). Smarter Than You Think: How Technology is Changing Our Minds for the Better. HarperCollins UK.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33.

Wooley, S. (2013). Constantly Connected: The Impact of Social Media and the Advancement in Technology on the Study Abroad Experience. Elon Journal of Undergraduate Research in Communications, 4(2). Retrieved from http://www.studentpulse.com/articles/822/constantly-connected-the-impact-of-social-media-and-the-advancement-in-technology-on-the-study-abroad-experience


 

Teaching ICTs – Computational Thinking Pedagogies & Thinking Strategies

As more and more schools start teaching computing explicitly, in one form or another, to all students, the focus moves from advocacy, getting computing for all into schools, towards pedagogy: how best to teach it. There are a number of pedagogical approaches and cognitive strategies that have been promoted, but I would like to look at just one which I have found effective over the years. I think it would be fair to say that many teachers of ICTs are self-taught, and perhaps for this reason the discovery method is still remarkably popular. The influence of Seymour Papert’s Constructionism, allied with Constructivist learning principles, has made direct instruction far less prevalent in the computer class than in perhaps any other classroom. Nevertheless, a lack of direct instruction, I believe, threatens to undo many of the benefits of a discovery learning framework.

Discovery learning is predicated on setting problems which will allow students to learn through grappling with trying to find solutions, drawing on their past learning and knowledge. But there is the danger that no learning will take place at all if there is no scaffolding of the discovery process, and ironically, the greater the reliance on student-centered learning, the greater the need for directed teacher interventions. The greatest weakness of the discovery method is encapsulated in the scholar’s dilemma. How do you discover something that you don’t know exists? You need a more experienced other to, at the very least, nudge you in the right direction.

Teachers who use discovery learning need to be careful to make sure that students have the tools they need to learn something from problem-based approaches. This scaffolding can take many forms, but without it learning is a very hit-and-miss affair. Teachers need to adopt a range of strategies to scaffold the kind of knowledge that will feed into problem-based learning activities. For example, you can carefully guide students through the base skills and knowledge needed, and then set more open-ended projects that build on this learning. Or you can set open-ended projects and make knowledge available where needed as students explore what they need to discover to solve the problems set. Alternatively you can pursue a mix of these approaches, giving some instruction up front and then supplementing knowledge where it is needed.

But another approach is also possible. If students are working in groups, you can use a jigsaw technique. Students can be split into expert groups and work on guided assignments so that they become experts in one aspect of the overall task. Groups are then formed by taking a member from each expert group so that each group has multiple experts in different aspects applicable to the task. They then teach each other what needs to be known to perform the task. For example, if a project uses spreadsheets, one expert can be trained in formatting cells, another in creating formulae, another in functions, one in graphing, and so on. When they combine they should be able to tackle a task which requires all these skill sets, and students should learn from each other.

An alternative approach is to have groups work on sub-tasks which combine to solve a larger problem. Each group works on a task which, when combined with the others, comes together to provide an elegant solution. For example, one group might work on moving a robot, another on the operation of its crane, and so on. These approaches do, however, require different students to receive different instructional input. This can be achieved using a flipped classroom model. Even when I cover material in class, I make a video of the content and post it on my Learning Management System so that it is accessible to everyone, even if a student did not do that particular task at the time.

Having said this, managing different instruction for different students is a logistical problem. I prefer to have a wide range of tasks and either allow students to choose tasks they wish to pursue, or focus tasks around building capacity for particular projects. An example would be the task below. By preparing task cards which set out the brief for each task, but also include a hint about how to proceed, students can work at their own pace, or use the tasks to build up the skills needed to tackle larger projects. The task card may include very detailed steps to follow, or simply hints.

I have gamified the tasks, which is not necessary, but does, I believe, add a certain something. If you allocate Experience Points (XP) for completing tasks, you can use the number of XP earned to unlock larger projects or challenges, much in the way that XP allows characters to level up in RPGs. This encourages students to complete as many tasks as possible, hopefully accumulating the base skills needed for the challenges and projects. Because the hints and instructions are on the reverse side of the task card, as a teacher you do not need to do a great deal of whole-class instruction. Jigsaw groups, or sub-task groups, can work relatively independently, typically after a whole-class session introducing a new application. If students are still stuck I am able to answer individual queries in class, and there are always the videos as back-up.
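For what it is worth, the unlocking rule is simple enough to express in a few lines of code. The sketch below is purely illustrative: the task names, XP values and thresholds are invented, and my actual task cards and mark book look nothing like this.

```python
# Hypothetical XP ledger: task name -> XP awarded on completion.
TASK_XP = {"format cells": 10, "basic formulae": 15, "functions": 20, "charting": 20}

# Hypothetical projects unlocked once a student has accumulated enough XP.
PROJECT_THRESHOLDS = {"budget spreadsheet": 30, "sports statistics dashboard": 60}

def unlocked_projects(completed_tasks):
    """Return the XP total and the projects a student may now attempt."""
    xp = sum(TASK_XP.get(task, 0) for task in completed_tasks)
    return xp, [p for p, needed in PROJECT_THRESHOLDS.items() if xp >= needed]

xp, projects = unlocked_projects(["format cells", "basic formulae", "functions"])
print(f"{xp} XP earned; unlocked: {projects or 'nothing yet'}")
```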

 

Computational Thinking – a new modality of thought or just what coders do?

I want to pose a question for consideration. There is a great deal of debate and disagreement over what Computational Thinking means. For some it describes how computer scientists go about what they do, akin perhaps to the scientific method for scientists (Wolfram, 2002), and is applicable only to computer scientists. For others it is a skill set that has implications beyond the field of computer science, a set of generalizable skills of benefit to all (Wing, 2006). A third view is that it represents something of a new mode of thought capable of unique explanations (Papert, 1980) and knowledge building. In this sense it goes beyond a set of procedures, like the scientific method, and might represent a mode of thought distinct from the paradigmatic (argumentative) and narrative modes of thought proposed by Bruner (1986).

The paradigmatic mode represents knowledge founded on abstract understandings or conceptions of the world. For example, I could explain why an apple fell to the ground by referencing the theory of gravity. This is largely the language and understanding of Science. The narrative mode of thought represents an understanding of the world founded in human interactions. I might explain why an apple fell by referencing a sequence of events in which my elbow knocked it off the table and I was not deft enough to catch it. Of course there is a continuum along which both modalities of thought intersect and interweave. So my question is whether computational thinking represents a separate mode of thought in its own right, or simply new combinations of the paradigmatic and narrative modes. If I were to model a world of apples, elbows and tables, my understanding of why apples fall might be based on a more complete understanding of how apples behave under different circumstances. The use of computational models allows for new ways of understanding the world, new ways of gaining understanding and knowledge. Chaos Theory, for example, emerged out of computational model building: paradigmatic formulations of the world followed from computational modelling, rather than the other way round.

When we create a computational model of a weather system and run our algorithms through computers with slightly different inputs to make a hurricane path forecast, for example, or use machine learning algorithms to predict heart disease more accurately, are we deploying a new kind of thought which is somewhat different from both paradigmatic and narrative modes?
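A toy version of that ensemble idea, using the logistic map as a contrived stand-in for a real atmospheric model (the parameter values are my own arbitrary choices), shows how running the same algorithm over slightly perturbed inputs yields a spread of outcomes rather than a single derived answer:

```python
def run_model(x0, r=3.9, steps=50):
    """Iterate the logistic map, a classically chaotic toy model."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Ensemble: the same algorithm run over slightly different initial conditions.
initial_conditions = [0.2 + i * 1e-6 for i in range(5)]
outcomes = [run_model(x0) for x0 in initial_conditions]
print("spread of outcomes:", [round(x, 3) for x in outcomes])
```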

The need to ask this question rests, perhaps, on the rapid development of Machine Learning and the way it threatens to disrupt our world. Machine Learning has brought us to a point where we might soon be farming out much of our thinking to intelligent machines. And while probabilistic approaches to artificial intelligence allow human beings to trace back what the machine has done with our algorithms, neural networks, with their black-box approach, represent thinking that is to a large extent opaque to us. It seems entirely possible, then, that in the not too distant future machines will be delivering to us knowledge of the world, and we will not be able to explain the thinking behind it.

The idea of Computational Thinking (CT) has a history, and it is interesting to unpack some of it. The term was coined by Seymour Papert (1980) and popularised by Jeannette Wing (2006), and there is general consensus that it refers to the thinking skills employed by computer scientists when they are doing computer programming, derived from the cognitive processes involved when you are designing an algorithm for getting “an information-processing agent” (Cuny, et al, 2010) to find a solution to a problem. For some, information-processing agents should refer only to machines, but for others they could include human beings performing computational tasks. Differences over how applicable CT is beyond computer science hinge on these nuances of understanding. I have often heard it said that getting students to design an algorithm for making a cup of tea represents CT, and that if students were to study designing algorithms through learning to code they would therefore be improving their general problem-solving skills. These claims are difficult to assess, but they are important because if CT applies only to the context of computer science, then its place in the curriculum occupies something of a niche, important though it might be. If, however, as claimed, it leads to benefits in general problem-solving skills, there is a solid case to be made for getting all students to learn programming. Equally, the case for exposing all students to some coding might rest on other claims unrelated to the transfer of CT to other domains.

Let’s start by looking at the claims made by the Coding for All lobby. Wing (2006) argued that CT skills have transferable benefits outside of computer science itself because they entail five cognitive processes (sketched in code after the list below), namely:

  1. Problem reformulation – reframing a problem so that it becomes solvable and familiar.
  2. Recursion – constructing a system incrementally on the basis of preceding information.
  3. Decomposition – breaking the problem down into manageable bites.
  4. Abstraction – modelling the salient features of a complex system.
  5. Systemic testing – taking purposeful actions to derive solutions. (Shute, et al, 2017)
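
As a concrete, if simplified, illustration of what some of these processes look like in practice, here is a short JavaScript sketch of a merge sort. It is my own toy example rather than anything drawn from Wing's paper, but decomposition, recursion, abstraction and systematic testing are all visible in it.

// Abstraction: the function sorts any array of numbers; the caller does not
// need to know how the sorting is done.
function mergeSort(numbers) {
  // Base case: an array of zero or one elements is already sorted.
  if (numbers.length <= 1) return numbers;

  // Decomposition: break the problem into two smaller halves.
  const middle = Math.floor(numbers.length / 2);
  const left = numbers.slice(0, middle);
  const right = numbers.slice(middle);

  // Recursion: solve each smaller problem in the same way,
  // then merge the two sorted halves.
  return merge(mergeSort(left), mergeSort(right));
}

function merge(left, right) {
  const result = [];
  while (left.length && right.length) {
    result.push(left[0] <= right[0] ? left.shift() : right.shift());
  }
  return result.concat(left, right);
}

// Systematic testing: purposeful checks that the solution behaves as expected.
console.log(mergeSort([5, 3, 8, 1]));  // [1, 3, 5, 8]
console.log(mergeSort([]));            // []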

Wing’s claim has received a great deal of attention and has become the bedrock of the Computer Science for All movement: the idea that all children should be exposed to CT by teaching them to code, both because such skills will become increasingly important in an increasingly digital world, and because they equip students for the kinds of problem solving that matter more and more. It is debatable, though, whether these cognitive processes are unique to computational thought. Abstraction and decomposition, in particular, might seem to be thinking processes shared by any number of activities. Wing’s thesis that computational thinking is generalizable to all other fields could perhaps be stated in the reverse direction: perhaps general cognitive processes are generalizable to computation? This point is not trivial, but it still might not threaten the thesis that learning to code or create algorithms is excellent for developing good problem-solving skills applicable to other fields.

The question of the transfer of skills gained in one context to another is, however, fraught with difficulty. Generally speaking it seems to me that knowledge and skills are gained within the framework of a particular discipline, and that the application of knowledge and skills in other contexts is always problematic to some extent. There is a close relationship between knowledge itself and what we call thinking skills. It is hard to imagine, for example, anyone possessing dispositions and thinking skills in History or Mathematics without possessing knowledge in those disciplines. As Karl Maton (2014) has pointed out, all knowledge has both knowledge structures and knowing structures. There is the stuff that is known and the gaze of the knower. In different fields, knowledge structures or knower structures may have greater or lesser relative importance, but one cannot distill out something which is pure knowledge, or pure knowing. The question of the transfer of skills from one context to another, from one field to another, is therefore not a simple one. Of course we do achieve this feat. At some point in my life I learned basic numeracy skills, within the context of elementary arithmetic classes presumably, and I have been able to apply this basic knowledge and skill set to other contexts, for example computer programming. But I am not so sure that the thinking dispositions I gained while studying History at university, and my appreciation for the narrative mode of explanation, are of much use when thinking about Computational Thinking and what I ought to be doing as a teacher of ICT skills. I am painfully aware that there are limits to the general applicability of the enquiry and data analysis skills that I learned when training to become an historian.

I did not train to become a computer scientist, and therefore I am very wary of commenting on how transferable skills in computational thinking might be to contexts outside the field. But I do believe we should be wary of claims of this sort. Peter Denning (2017) has argued that the idea that all people can benefit from CT, from thinking like computer scientists, is a vague and unsubstantiated claim. For Denning, the design of algorithms (algorithmic thinking) rests not on merely setting out any series of steps, but speaks specifically to the design of steps controlling a computational model. It is context-bound.

My understanding from this is that the case for teaching everyone to code cannot rest solely on the argument that CT transfers benefits to other fields; that case has yet to be proven. It does not mean that teaching coding to all is not a good thing. I believe that learning to code is a rigorous discipline which is good for the mind, that it has value because we live in a world where computer programs are increasingly important, and that the problem solving it involves is itself beneficial. All in all I think the case for teaching coding to all is extremely cogent.

I also have a sneaking suspicion that the question I posed in my opening remarks is going to be raised more and more frequently as artificial intelligence is applied more widely, and if so, having a population trained to some level of competence in computational thinking is probably a really good idea.

Bibliography

Bruner, J. (1986). Actual Minds, Possible Worlds. Cambridge, MA: Harvard University Press.

Cuny, J., Snyder, L., & Wing, J. (2010). Demystifying Computational Thinking for Non-Computer Scientists. Work in progress.

Curzon, P., Bell, T., Waite, J., & Dorling, M. (2019). Computational Thinking. In S. A. Fincher & A. V. Robins (Eds.), The Cambridge Handbook of Computing Education Research (pp. 513–546). Cambridge: Cambridge University Press. https://qmro.qmul.ac.uk/xmlui/bitstream/handle/123456789/57010/Curzon Computational thinking 2019 Accepted.pdf?sequence=2&isAllowed=y.

Denning, P. J. (2017). Remaining Trouble Spots with Computational Thinking. Communications of the ACM, 60(6), 33–39. https://doi.org/10.1145/2998438

Guzdial, M. (2011). A Definition of Computational Thinking from Jeannette Wing. Computing Education Research Blog. https://computinged.wordpress.com/2011/03/22/a-definition-of-computational-thinking-from-jeanette-wing/

Maton, K. (2014). Knowledge and Knowers: Towards a Realist Sociology of Education. London, UK: Routledge/Taylor & Francis Group.

Papert, S. (1980). Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books.

Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying Computational Thinking. Educational Research Review, 22, 142–158. https://doi.org/10.1016/j.edurev.2017.09.003

Wing, J. (2006). Computational Thinking. Communications of the ACM, 49(3), 33–35. https://doi.org/10.1145/1118178.1118215

Wolfram, S. (2002). A New Kind of Science. Wolfram Media, Inc. https://www.wolframscience.com/nks/

 

The Matrix Game as a Thinking Tool

Matrix Games (developed by Chris Engle) are an excellent classroom tool. A handbook detailing their application in education and other professional settings has recently been published. They represent a flexible and easy-to-use game mechanism for any role play or simulation setting. In brief, players (either individually or in teams) make arguments about what they wish to do and why this should happen. An umpire then assesses these arguments and evaluates how likely they are to succeed. A die is rolled to simulate luck, and the argument either happens or fails to happen. Games can be relatively free-flowing or more structured depending on the context and desired result.

So, for example, in a History class in a game simulating the Russian Revolution, the Bolshevik player might argue that the First Machine Gun Regiment would lead an assault on the Winter Palace, with the backing and support of the party, and that this would succeed because the regiment was well armed and prepared, was militarised and supported the party, and because the palace was weakly defended by troops whose loyalty was suspect. The umpire might rule this argument average, giving it a 50% chance of success. The umpire’s reasons for ruling this way might be that although the First Machine Gun Regiment historically did indeed ask the Bolsheviks to take action, the Bolshevik leaders turned down their request to begin the revolution, fearing it was premature. Each player makes arguments which either succeed or fail and the game develops in this way along its own trajectory. In this example it is likely that the players will develop a better understanding of the background and the forces at play in shaping the outcome of the revolution. The Matrix Game is an excellent tool for running simulations in the classroom, but I want to argue that it also represents a cognitive tool in its own right and should be added to every teacher’s toolbox. The Matrix Game supports two major cognitive processes: thinking and communicating with clarity and precision, and listening with empathy and understanding.
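
For readers who like to see the mechanism laid bare, here is a minimal sketch of that adjudication step in JavaScript. The strength categories and dice thresholds are my own illustrative assumptions rather than Engle's published rules; the only fixed point is that an average argument succeeds roughly half the time.

// Minimal sketch of Matrix Game adjudication. The strength categories and the
// dice thresholds below are illustrative assumptions, not official rules.
const thresholds = {
  weak: 5,      // a weak argument succeeds only on a 5 or 6
  average: 4,   // an average argument succeeds on a 4, 5 or 6 (about 50%)
  strong: 3     // a strong argument succeeds on a 3 or better
};

function rollD6() {
  return Math.floor(Math.random() * 6) + 1;
}

// The umpire assigns a strength to the argument; a die roll then decides
// whether the argued action happens in the game world.
function adjudicate(argument, strength) {
  const roll = rollD6();
  const succeeds = roll >= thresholds[strength];
  return { argument, strength, roll, succeeds };
}

// Example turn: the umpire rates the Bolshevik player's argument as average.
console.log(adjudicate("The First Machine Gun Regiment storms the Winter Palace", "average"));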

Thinking and Communicating with Clarity & Precision:

When advancing an argument, players need to think about what they want to happen and the reasons why this action will be successful. This can be scaffolded by providing a matrix of reasons (which is why it is called a Matrix Game), but usually players draw on the matrix of the real world, or of imaginary worlds, to support their arguments. In this way the argument represents the conclusion, and the matrix of reasons the premises, of a logical argument. The form of the Matrix Game thus forces players to think in logical and coherent ways about what they want to argue and why it should work.

Empathy & Understanding:

Players assume different roles and compete against other players to have their perspectives advanced. This necessitates understanding the world from another’s point of view and simulating action from that perspective. I am currently running a game in which different interest groups compete to mine the moon. Each team makes arguments from their own perspective. By setting different victory conditions for each team, the players are scaffolded in framing their actions from a certain point of view. See the graphic on the right.

Because different teams have different criteria for winning, they will try to manoeuvre game outcomes in their favour. I like to set Victory Points at 2 points or 1 point for different outcomes. A player can claim a win on 2 points, or a partial win on 1 point. They lose if they achieve none of their victory conditions. This encourages players to work with others to reach compromises.
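
A minimal sketch of that scoring rule, with the example outcomes invented purely for illustration:

// Sketch of the victory-point scoring described above: 2 points for a full
// win condition, 1 for a partial one. The example outcomes are invented.
function claimResult(pointsEarned) {
  if (pointsEarned >= 2) return "win";
  if (pointsEarned === 1) return "partial win";
  return "loss";
}

// Example: a team that achieved a 2-point condition claims a win; a rival
// that only achieved a 1-point condition claims a partial win.
console.log(claimResult(2)); // "win"
console.log(claimResult(1)); // "partial win"
console.log(claimResult(0)); // "loss"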

I believe that this mechanism helps students to develop an understanding of different perspectives in ways which encourage a much deeper appreciation of how actions are founded on world view and one’s standpoint.

Setting up a Matrix Game

Matrix Games are easy to set up. All you need is a context and roles for players to simulate. You can set up role play cue cards as depicted for the Mining The Moon game, or allow players to define their own roles. Play normally proceeds in turns during which each player gets a chance to advance an argument, but you can adapt this to suit your needs quite easily. For example, I sometimes let players submit an argument whenever they wish to, but then I make them submit in writing and adjudicate in the order received.

When umpiring arguments it is a good idea to assume average as a starting point, and then decide whether an argument is weaker or stronger based on criteria more directly linked to the curriculum. Go with your gut instinct. I always try to reward greater understanding of a context and give reasons why I am ruling something weaker or stronger. Adjudication, of course, is always done in terms of what has already succeeded in the game; you cannot have one argument cancel out another. Arguments that closely support other arguments are automatically very strong.

If you try out a Matrix Game in your classroom, please drop a note in the comments. It would be great to hear your experience.

 

 

Five Apps that Support Group Work in the Classroom

Constructivist Learning Theory emphasises the value of Group Work in the Classroom. As Vygotsky has highlighted, learning is first social before it becomes internalised. In other words, the more opportunities students have to discuss and work through any content, the greater the opportunity to internalise that content. And yet many students have a hatred of group work. Learning to work with other people is not easy. Those with a healthy work ethic often do not know how to handle interactions with those who have less motivation to finish a task. Those who are used to achieving high marks for their individual assignments often feel resentful towards those who turn in work they consider drags them down. Should they just take over and do all the work themselves, or should they accept peer contributions which they consider sub-standard? Others in the group may be resentful of those who try to take over, or who come across as bossy or exacting. And yet, more than ever, learning to work together and think interdependently is considered a crucial and employable skill.

Are there any digital applications which can help quieten the choppy Group Work waters? Here are five suggestions.

1. Google Docs

Google Docs provides unparalleled functionality for facilitating collaborative text authoring. A document can be shared with all members of the group, and the teacher, and then everyone who has been given editing rights can work on the document simultaneously. All changes are saved automatically. There is an online chat facility, and authors can leave comments and suggest edits. One of the greatest limitations on collaboration has always been the difficulty of sharing a document and writing it up; one member of the group often had to volunteer to do the “write up”. Google Docs allows this workload to be shared.

Teachers can carefully scaffold tasks within a Google Doc and then share the document with a group so that the steps to be taken are highlighted, and strategies which might be deployed to afford collaborative thinking are suggested. In the graphic, the teacher is suggesting that de Bono’s Thinking Hats might help the group explore the topic through parallel thinking. Teachers can comment at any stage during the authoring process, much as teachers circulating in a classroom can eavesdrop and intervene where necessary to get a group back on track. This allows teachers to continue scaffolding learning in class, and outside class while students are authoring their write-up.

These affordances for collaborative authoring and scaffolding make Google Docs one of the most valuable educational tools to emerge in recent years. Students are able to use Google Docs both while in group discussion and for after-school homework.

2. Bubbl.us

Bubbl.us is a web-based tool with a limited free version and paid options. It allows users to set up a mind-map board which groups discussing a topic can use to create mind maps and save them as a JPEG, PNG or even HTML file, which can be downloaded and shared. Upgrading to a paid version allows users to share a mind map which can then be used for follow-up tasks.

One of the limitations of any paper-based mind map is how to share it if the ideas are needed for follow-up action. To my mind, mind-mapping tools offer the key affordance of guiding discussion around how ideas fit together. They force students to address questions such as where a particular idea fits, and this helps sharpen an argument.

Bubbl.us allows grid, tree and bubble layouts. You can insert files only with an upgraded paid version, but the free version does allow links, so students can use the mind map to record useful links.

Some way of recording a discussion in a form which can later be shared is invaluable, and mind maps are especially useful because they force students to organise their thoughts as they go.

3. Padlet

Padlet is a web-based tool with free and paid options. The free version allows up to four walls. On a wall you can add files, voice and video recordings, links searched from within Google, text and doodles. You can share the wall with other users, each with authoring rights, or share a link, or a wall saved as a PDF or image.

The chief affordance, to my mind, is the facility for co-authors to add voice or video messages to the wall. This provides a superb tool for a group to collect resources and leave commentary, both while planning a project and when reporting back, with group members recording commentary on different aspects of a topic.

A teacher can set up a topic and invite students to co-author a wall, thus setting up a group, and can provide stimulus sources if required; alternatively, groups can set up their own walls and share them with each other informally, or with the teacher more formally. Walls have different themes and templates which can be applied, and a wall can be deleted when it is no longer needed.

4. Kahoot!

Kahoot! is a tool which allows students to create quizzes, games or discussion boards which they can then share with the rest of the class. This makes for a great end product which encourages a group to research a topic, master the content and share it with the class in the form of a quiz. Students find Kahoots engaging to create and to consume, which gives the teacher one means of ensuring that the end product is itself engaging and that the group takes care in its creation.

Thinking of suitable questions to ask the rest of the class is a great way to get students to dig deeper into a topic than they might otherwise have done. The competitive nature of the quizzes also seems to encourage students to put in greater effort.

5. Lino

Lino is a web-based sticky-note application which allows multiple users to post sticky notes on a topic. Users can post files, links to videos or images on an electronic cork-board, which allows for group-based brainstorming. It is a very versatile tool in that it can be used by a group or a whole class, for multiple purposes from group discussion through to presentation, feedback or reflection.

I like to use it as a reflection tool for students to post final comments on a topic after group-based feedback presentations have been made. It is quick and visual, allows for a rapid round-up of reflections or comments, and makes for a good way to sign off on a topic.

For a teacher it is a good way to spot any comments which reveal a need for further action. Maybe some aspect of the topic needs to be picked up at a later stage, or could do with further exploration.

This list of tools is by no means exhaustive. There may be better examples of applications with improved functionality. All of these tools, however, represent different ways in which collaborative group-based work can be usefully supported and enhanced. Please use the comments to suggest other tools, or share how you are using these tools.

Critical Thinking & ICTs – Part 1

There is a narrative which says that ICTs offer unique affordances for critical thinking in the classroom. This argument sees the introduction of new technologies in the classroom as a prerequisite for a new emphasis on critical thinking. The 21st Century Skills Movement sees change itself as a rationale for the need for critical thinking, and technology as a central skill set for success in a changing world.

Now, this blog is dedicated to exploring how ICTs and Critical Thinking intersect, so I have rehearsed elements of this narrative many times. I do believe that ICTs have affordances which can be leveraged to achieve greater critical thinking, but the relationship is not simple or direct, and I have been around long enough to remember when teachers sought to foster critical thinking quite independently of digital technologies. As one who considers himself a champion of ICTs and Critical Thinking, I believe it is important to have a theoretical framework for the relationship between critical thinking and technology adoption, one which helps us understand how we can use technology to build better critical thinking.

Thinking around what critical thinking means is often somewhat woolly. For some students it appears to come naturally. Their arguments are well structured, well supported, with greater nuance and generative power. Other students struggle to present or analyze ideas effectively, and teachers are often unsure exactly what to do to help improve thinking. What exactly does effective thinking look like anyway?

Many teachers are using particular thinking strategies to foster critical thinking. Tools such as De Bono’s Thinking Hats, David Hyerle’s Thinking Maps, Harvard’s Visible Thinking or Tony Ryan’s Thinker’s Keys are designed to provide particular pathways to better thinking. These strategies represent pedagogies claiming to offer affordances for critical thinking in much the same way as claims are made that ICTs afford critical thinking. The claims for these strategies rest on the affordances of specific thought processes. For example, the Thinking Maps offer scaffolding for defining, describing, comparing and contrasting, classifying, sequencing, analyzing cause and effect, identifying part/whole relationships and seeing analogies. The Thinking Hats are said to maximise and organise thoughts and ideas by deploying parallel thinking techniques. The Visible Thinking routines represent attempts to increase metacognitive awareness, for example to draw on previous knowledge, explore diverse perspectives or deploy active reasoning and explanation. These cognitive strategies represent something of a toolbox. Much as a DIY handyman reaches for a specific tool to tighten a bolt or screw, remove a nail or fill a hole, particular cognitive tools can be used for different cognitive purposes. The teacher’s job becomes that of modelling and scaffolding students’ thinking, helping students recognise which tools are appropriate for which purpose and how to use them effectively to improve their thinking, so that increasingly students are able to use these tools appropriately without prompting.

This way of looking at critical thinking is not the only way to conceive of it, but it is a useful metaphor for teachers and offers a focused approach which teachers can apply in their classrooms. The question is, is there a similar way we can think about how ICTs may be used as tools for cognitive education?

Similar approaches have been tried. For example, Bloom’s Digital Taxonomy represents an attempt to map digital tools to Lower Order and Higher Order Thinking Skills. So, for example, podcasting is seen as a Higher Order Thinking Skill of Creating, while Social Bookmarking is seen as a Lower Order Thinking Skill of Remembering. What this model lacks, however, is a nuanced understanding that tools in themselves do not mean much; it is how they are used, and for what purpose, that is important. One can use Twitter, for example, at every level of Bloom’s taxonomy. One-to-one mapping of tools to a taxonomy of thinking, regardless of purpose and use, does not make much sense. Digital tools are not, therefore, the same as the cognitive tools described above. Any framework for digital cognitive tools needs to include their use and purpose.

For example, Google Docs carries massive affordances for collaborative thinking. Students can collaborate on writing or problem solving tasks, using comments and joint editing to develop ideas collaboratively. But Twitter can also be used in this way, and so can Skype, and many other tools. Google Docs can also be used in ways which do not display collaborative thinking at all! Over the course of the last few decades teachers have identified uses of technology which can aid cognitive processes such as collaborative thinking. It seems to me that any framework of cognitive digital tools needs to focus on the cognitive purpose rather than the technology. A useful approach would be to look at teaching practice and try to map cognitive digital tools to thinking processes. In order to do this, however, we need a much less woolly framework for understanding cognitive processes.

There are many different frameworks for critical thinking. I would like to detail just a few below, and then suggest a way forward.

Bloom’s Taxonomy

Bloom’s (1956) taxonomy of the cognitive domain remains the standard framework for thinking about thinking in the classroom. It establishes six levels of cognitive processes which are seen as moving from simpler to more complex skills. The model has been revised by Anderson, Krathwohl, et al (2001), and both models are widely taught in pre-service teacher education and represent something of a lingua franca in the educational world. This is a considerable strength in that it is already the most commonly used framework by teachers concerned with cognitive education. However, I have to say that it is not a particularly generative model, and in my estimation it is often used simply and mechanistically to rationalise what is done in the classroom rather than to drive critical thinking. Because categories of cognition are not in reality discrete, the exercise of identifying levels is somewhat meaningless, and the pedagogical purpose of doing so unclear.

The model does not drill down to thinking routines themselves. Analysis, for example, implies an ability to differentiate between premise and conclusion, to recognise what constitutes evidence, to expose logical flaws, and so on. But the model tends to obscure this rather than highlight it. To my mind Bloom’s model ends up being a limiting factor in promoting critical thinking in the classroom. The taxonomy emerged as part of a movement to clearly define educational objectives and remove woolly thinking, but in fact it does far more to obscure thinking than the liberal tradition it replaced.

As we have seen with Bloom’s Digital Taxonomy, this woolliness, both in the cognitive domain and in how its categories map to digital tools, renders the framework somewhat vague. What does it really mean when a teacher says, for example, that they are using blogs to enhance student capacity for creating?

The Paul-Elder Approach

The Paul-Elder framework attempts to draw up a three-tiered model for Critical Thinking, defined as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.” (Scriven & Paul, 2003). The model is based on the structures of thought, universal intellectual standards and intellectual traits exhibited by critical thinkers.

The strength of the model is that it does not focus on discrete thinking routines alone, but integrates the habits and dispositions of successful thinkers into the framework, and that it does manage to drill down to the elements of reasoning directly. Its major downside is its very complexity. For all its faults, Bloom’s taxonomy can be summarized in six words; the Paul-Elder model is far more difficult for teachers to navigate, and this limits its ability to be adopted more widely. Nevertheless, this complexity does hold out the promise of a more meaningful mapping of digital tools to thinking routines in the classroom. If a teacher were to say that they were using blogs to explore Fairness applied to Points of View to develop Intellectual Empathy, one can appreciate that the model leads to a clearer notion of how digital tools can be used to sharpen critical thinking in the classroom.

Semantic Waves
Another way of looking at the problem is to try to drill down to how meaning is constructed and deconstructed in the classroom. A new framework (Semantic Waves) for thinking about knowledge practices in the classroom, derived from the work of Basil Bernstein and Pierre Bourdieu, allows us to bring powerful concepts to bear on semantic practices in the classroom. Maton (2014) has described how the concepts of semantic gravity and semantic density can be used to describe pedagogical practice in ways which allow us to think about the critical thinking implicated in classroom talk.

Semantic waves are descriptions over time of the relative semantic gravity or density of the ideas contained in classroom talk or student essays. Semantic gravity refers to how concrete or how abstract an idea is, and is represented as SG+ (a very concrete, grounded, contextualized idea) or SG- (a very abstract, rarefied concept), with all points in between. The word Revolution in History, for example, is an abstract idea, relatively free of particular contexts. A particular incident from the Russian Revolution, however, is more contextualized and concrete. One thing that teachers tend to do is to take abstract ideas (SG-) and explain and contextualize them by giving examples and instances (SG+); they help unpack concepts so that students can understand them better. They then help students take more concrete instances and everyday knowledge and repackage them in terms of the more academic language and understandings of the discipline they are studying, as shown in the diagram.

Semantic density refers to how condensed an idea is. A symbol or metaphor conveys far denser meaning (SD+) than the everyday meanings of words (SD-). Poetry, for example, is generally denser than prose.

From the idea of the semantic wave, or how semantic gravity and density change over time, Maton has described semantic profiles, or typical scenarios. Often a discussion or a student essay will remain generalised and abstract, never exploring examples, supporting evidence or anecdote to develop an idea or argument. This represents a high semantic flatline, as shown in the illustration. Sometimes the discussion will remain at a concrete level, without any conclusions being drawn; this is a low semantic flatline. More usual in any kind of constructive meaning making is a much wider range and flow between abstraction and the concrete as arguments are made and supported by evidence. Seeing critical thinking in terms of creating semantic profiles opens up new ways of looking both at ICT usage in the classroom, something which I explored in my own research (Love, 2016), and at how Thinking Strategies offer pedagogical affordances for meaning making (see the video below), an idea which still needs to be explored.
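
To give a sense of what working with semantic profiles might look like in practice, here is a toy sketch that codes turns of classroom talk on a simple concrete-to-abstract scale and checks how widely the talk ranges. The five-point scale and the sample data are my own invention for illustration; they are not part of Maton's published instrument.

// Toy sketch of recording a semantic profile. Each turn of classroom talk is
// coded on a simple 1-5 scale (1 = very concrete, SG+; 5 = very abstract, SG-).
// The scale and the sample data are invented for illustration only.
const lessonProfile = [
  { turn: "Defines 'revolution' in the abstract", level: 5 },
  { turn: "Recounts the storming of the Winter Palace", level: 2 },
  { turn: "Describes the mood of the Petrograd garrison", level: 1 },
  { turn: "Repackages the episode as a crisis of legitimacy", level: 4 }
];

// A crude indicator of a flatline: talk that never moves far from one level,
// whether high (abstract) or low (concrete), has a narrow range.
function semanticRange(profile) {
  const levels = profile.map(entry => entry.level);
  return Math.max(...levels) - Math.min(...levels);
}

console.log(semanticRange(lessonProfile)); // 4: a wide wave, not a flatline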

I believe that the Semantic Wave framework offers a way of understanding how pedagogical approaches and technologies afford the construction and deconstruction of meaning in the classroom in detailed and powerful ways. It is, however, under-researched and must remain somewhat tentative at this stage. It represents both a pedagogy in its own right and a research framework. The ideas are somewhat abstract and may be off-putting to many teachers. To me as a teacher, the framework instantly made sense, but it is an idea that needs some explaining!

 

Putting it together

The three frameworks discussed all represent somewhat different ways of approaching critical thinking in the classroom, each with strengths and weaknesses. In many ways there needs to be a synthesis of all three types of approach to create a model which both explains and informs practice, allows critical thinking learning objectives to be realised, and lets tools and pedagogies be integrated within any particular lesson.

In the next blog post I will try to unpack how I believe this might be achieved and to begin to suggest a tentative framework which meets these requirements.

 

Bibliography

Anderson, L.W., Krathwohl, D.R., Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J., Wittrock, M.C. (2001). A Taxonomy for Learning, Teaching, and Assessing: A revision of Bloom’s Taxonomy of Educational Objectives. New York: Pearson, Allyn & Bacon.

Bloom, B.S. (Ed.). Engelhart, M.D., Furst, E.J., Hill, W.H., Krathwohl, D.R. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.

Love, D. A. S. (2016). Any Tool Works If You Are Using The Language: The Role of Knowledge in ICT integration in a Johannesburg private school (Masters dissertation, School of Education, Faculty of Humanities, University of the Witwatersrand, Johannesburg).

Maton, K. (2014). Building powerful knowledge: The significance of semantic waves. In Knowledge and the Future of the Curriculum (pp. 181-197). Palgrave Macmillan UK.

Scriven, M., & Paul, R. (2003). Defining Critical Thinking. http://www.criticalthinking.org/pages/defining-critical-thinking/410, accessed 12/12/2016.

 

 

Hack Your Life!

Learning to Code is all the rage right now, but implementing a coding-for-all programme is not as easy as it sounds. There are a number of decisions which need to be taken. The first decision is around whether it will be merely open to all, or compulsory for all. At my school we expose all the students to a little bit of coding in their computer skills classes, in the form of some Scratch and some JavaScript. I also do a Game Design unit using Flash, with a little bit of ActionScript. It works in that everyone can meet the requirements, but not all students embrace coding enthusiastically, so there might be a great deal to be said for going the extra-curricular route, or maybe both.

I suspect that both is the right answer for most contexts. Everyone needs to be exposed to some coding, but I’m not convinced everyone can handle a full-on programme. The second decision is what programming language to use.

I started teaching some coding back in the late 1990s, with some Logo, and then quickly moved to JavaScript. The big advantage of JavaScript is that you do not need a compiler: all you need is a browser and a text editor. Currently I get my grade 9s to use JavaScript to create a quiz which tells the user whether they are right or wrong, and tallies a score. When they create Flash games, I teach them to use AS3 to create drag-and-drops, and how to use tutorials to learn more skills. However, with many new interfaces for creating mobile apps appearing, my gut feeling is that this is the way to go, and I will probably take this route this year. MIT has a platform for creating apps, but there are so many popping up that I haven’t been able to research them all.
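
For anyone curious about the grade 9 task, here is a stripped-down sketch of the kind of quiz logic involved. The questions, answers and function names are invented for the example rather than taken from my actual worksheet.

// Stripped-down sketch of the grade 9 quiz task: check each answer, tell the
// user whether they are right or wrong, and tally a score.
const questions = [
  { prompt: "What does HTML stand for?", answer: "hypertext markup language" },
  { prompt: "What does CSS stand for?", answer: "cascading style sheets" }
];

// Reports "Right!" if the response matches the expected answer, otherwise
// reports the correct answer. In class the responses come from prompt() boxes.
function markAnswer(question, response) {
  const correct = response.trim().toLowerCase() === question.answer;
  console.log(correct ? "Right!" : "Wrong. The answer was: " + question.answer);
  return correct;
}

// Tally a score over a set of responses, one per question.
function runQuiz(responses) {
  let score = 0;
  questions.forEach((question, i) => {
    if (markAnswer(question, responses[i])) score++;
  });
  console.log("You scored " + score + " out of " + questions.length);
}

runQuiz(["Hypertext Markup Language", "coloured style sheets"]); // scores 1 out of 2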

Here’s the introductory video for the MIT App platform to give you an idea of how it works.

The third decision is around how to build enthusiasm. Some students will enter into it with gusto, and there is so much available online that they will be able to teach themselves. But getting the social aspect to work is vital if any programme is to have staying power. If you can meet face to face that is the best option, but in many schools the normal sporting and extra-curricular programme is so full that finding a mutually agreeable time is well-nigh impossible. I have tried running a virtual club, but the buy-in is limited. Special programmes which run for a limited period of time, such as a Hackathon or Hack Off, may work better if you can find a niche in the calendar. I have been trying to shoe-horn some coding time into whole-school programmes such as cross-curricular tasks, or end-of-year programmes when teachers are marking exams and willing to sacrifice curriculum time, but for some reason staff meetings tend to resist the idea as soon as you mention coding, or even worse, hacking. For students, however, the word hacking has a much more positive valency.

I honestly don’t know what the answer to all three questions is, but I do sense that this year the zeitgeist is different. The idea that everyone should code is so out there, I think it may just take hold!

 

 

 
 