
Is Thinking Digitally a New Habit of Mind or Simply Old Habits in a New Context?

[Image: Thinking Digitally logo]

What makes for good thinking? It is commonly assumed that it is our intelligence or abilities that have the greatest effect on our thinking. Ron Ritchhart (2002), however, makes the case that it is our dispositions, our acquired patterns of behaviour, which activate and direct our abilities. More than our abilities themselves, our dispositions afford our capacity for greatness. These dispositions are shaped and formed by our interactions with others, by our culture, society and history (Ritchhart & Perkins, 1997). Dispositions, or habits of mind (Dewey, 2007), are pivotal because they determine when we choose to think, the importance we place on it, and how deeply we question the assumptions and narratives (Bruner, 1991) upon which thought is based, allowing new narratives to emerge.

For example, the habit of mind persisting suggests a disposition towards keeping at it: trying new ways to approach a problem, not being daunted by failure, worrying away at something until you succeed. This disposition applies in many different contexts, including digital spheres, such as computer programming. It is clearly a disposition which affords success. You cannot succeed if you give up too easily!

Likewise, the capacity for finding humour affords success in many different contexts. Costa & Kallick (2008) have identified sixteen such habits of successful thinkers. These habits do not describe the knowledge or skills necessary for success, but the dispositions which enable successful thought and action in the world.

In this article I will be arguing that Thinking Digitally is just such a disposition; a habit of mind which predisposes those who demonstrate it towards success in the same way that persistence or finding humour does. It is a disposition towards using digital tools to aid thought and action in the world; towards deploying algorithmic thinking as a way of generating explanations of reality, and using computational thinking as an approach to problem solving. These ideas will be unpacked below, but it is important to note that they share a concern with the disposition towards using digital devices to create, solve problems and understand the world we live in. I have chosen to use the term Thinking Digitally rather than computational thinking, the more commonly used term, because I believe it describes a wider range of uses of digital devices: beyond how we organise tasks that are going to be automated by machines, mathematical modelling, and the use of simulations, simulators, calculators and computers, it includes the ways in which we access and evaluate information, and how we author digital texts and communicate our understandings in the current era.

I will argue that Thinking Digitally is a logically separate habit or disposition rather than the application of any combination of the other sixteen habits in a new context, the digital world. Marc Prensky (2001a) has advanced the idea that people born after a certain date, whom he has dubbed digital natives, carry a natural disposition towards digital technologies. He argues that people born before the advent of the digital era are digital immigrants; that digital technologies are somehow foreign to them, and that while digital natives are able to use digital devices effortlessly, for immigrants there is no such natural facility. In other words, digital dispositions are based on age. While this idea has gained widespread traction because it seems to explain why some people take to digital technologies like a duck to water, while others struggle with them or avoid them entirely, it is a notion that has been roundly critiqued, as we shall see below. I shall argue that the notion of a set of digital dispositions based on certain characteristics or habits of mind, rather than on age, carries more weight.

In recent years, within the Computer Science community, teachers have begun to argue that Computational Thinking, often called Algorithmic Thinking, should be integrated into the curriculum, not just as a separate discipline, but as a foundational subject. I will argue that Computational and Algorithmic Thinking form a new mode of thought, different from the paradigmatic or narrative modes, and that the predisposition towards using this mode of thought forms the basis for arguing in favour of a separate habit of mind. I will argue that a disposition towards using computational and algorithmic thought, and, allied to this, towards using digital media to access and legitimate knowledge, forms part of an evolving revolution in thought which is transforming the world, and towards which some people appear to be more disposed, and others less. Acquiring the habit of digital thinking clearly advantages some, and leaves others floundering in an increasingly digital world.

If we look at recent ideas around epistemology, theories about learning and cognition, pedagogy and literacy, we can begin to discern polar opposites which, while they do not define neat categorizations of analogue and digital thought, do point to ways in which certain polarities are afforded by either analogue or digital modalities. We are living in an Information Age (Castells, 2011), in which the storage and processing of information in digital forms is increasingly central to our knowledge and economic survival, and this privileges certain ways of looking at and thinking about the world. Our ability to succeed in this altered landscape is largely dependent upon our dispositions and habits of mind, not just on our knowledge or skills. Some of these dispositions have already been described by Costa and Kallick, but some are unique to digital contexts, and revolve around how we think about our relationship with digital cognitive technologies.

Kahneman (2011) has argued that there are two systems of thinking in operation. System I thought is fast, intuitive thinking in which we come to rapid assessments based on our accumulated habits of thought and the received judgements handed down to us. System II thought, on the other hand, is slow thinking in which we carefully reason through a position, considering the evidence and reaching a judgement based on a logical process. We need to rely on System I thought most of the time. We cannot tell life to wait while we carefully reason over a position. System II thought is, however, vital because it allows us to build better instincts, and base our intuitive responses on more solid ground. In other words, it frames our habits and dispositions. Successful critical thinking is based on building solid habits of mind on which we can depend when we don’t have the time to carefully consider our responses.

In much the same way, I want to make the case that the salient feature of our use of digital technologies lies not in the skills and abilities we bring to any given task, but in the dispositions and habits of mind that unlock what we are able to accomplish with technology. Behind digital literacy lies, in other words, a set of mental habits and dispositions which allow the power of the technology to be unleashed. My capacity to use a database is one thing, but it is my digital dispositions which determine when I choose to use a database, for what purposes, and to what effect. Their dispositions are what allow some people to use technology to unlock their creativity and solve problems rapidly and effectively, and what constrain others to use it merely for repetitive or largely meaningless tasks, or not to use it at all.

What are the dispositions and habits of mind of successful people when they are using digital devices and media? Do these dispositions form a logically discrete entity or are digital technologies simply a different context in which the habits and dispositions (Costa & Kallick, 2008) that apply to analogue thinking are applied?

In working through this argument I am going to make a number of moves which are intended to present the case that there are indeed a set of uniquely digital dispositions, and that it makes sense to speak about thinking digitally as an overarching approach that successful thinkers bring to their use of technology, which shapes and guides effective digital practice. Indeed I will argue that these digital dispositions may also be brought to analogue contexts. In other words, we human beings, in learning how to think effectively when using digital tools, may have acquired some dispositions which we can carry back into the real, analogue world.

Computational Thinking

[Figure: a breakdown of the components of computational thinking (“Reading A Book or An Hour of Code?,” 2016)]

The first move is to argue that one set of digital dispositions hinges around the thinking that we do when we interface with machines. A frequently used term is Computational Thinking (Wing, 2006), which describes the “thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information processing agent” (Lee, 2016). It is essentially taken to mean, then, the thought processes of one who is using a machine to solve a problem, and it is normally constituted as resting on three cognitive pillars: abstraction, automation and analysis. When we use cognitive technologies to automate or perform tasks, we need to program the machine: we need to break the task down into sub-routines that can then be automated, and we need to think the task through, understanding it from the point of view of what it is that the machine can and can’t do. The illustration above (“Reading A Book or An Hour of Code?,” 2016) is typical of attempts to break down what constitutes computational thought.
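
To make the three pillars concrete, here is a minimal sketch in Python; the task and all the function names are invented for illustration, not drawn from any of the authors cited:

    # A minimal sketch (hypothetical task and function names) of breaking
    # a problem into sub-routines that a machine can automate.

    def parse_price(line: str) -> float:
        """Abstraction: reduce a messy input line to the one feature we need."""
        return float(line.split(",")[1])

    def total(lines: list[str]) -> float:
        """Automation: the repetitive work is delegated to a loop."""
        return sum(parse_price(line) for line in lines)

    def check() -> None:
        """Analysis: test the solution against a case we can verify by hand."""
        assert total(["a,2.50", "b,1.25", "c,1.25"]) == 5.0

    check()
    print(total(["apples,2.50", "bread,1.25", "milk,1.25"]))  # 5.0

Each pillar here is as much a disposition as a skill: noticing what can be abstracted away, what can be delegated to the machine, and what must be checked.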

Proponents of Computational Thinking in the curriculum argue that “algorithmic thinking” should form the fourth R of the curriculum: Reading, ‘Riting, ‘Rithmetic & ‘Rithm (Grover & Pea, 2013). The two terms appear to be used interchangeably. What then is Algorithmic Thinking?

Algorithmic Thinking vs Narrative & Paradigmatic Thought

[Figure: paradigmatic and narrative modes of thought]

Bruner (1991) sets out a dichotomy between two modalities of thought: the narrative and the paradigmatic. The paradigmatic mode, the scientific mode, seeks to logically categorize the world and test hypotheses of what is. The narrative mode, on the other hand, is concerned with the meaning that is ascribed to lived experience. When explaining the world, why an apple falls to the ground, for example, one could use paradigmatic thinking, appealing to a concept, the theory of gravity. But one could also use the narrative mode to explain why an apple falls to the ground: the apple falls because I knocked it with my elbow, and it rolled off the table and fell to the ground. There is nothing necessarily superior in either explanation. Both have explanatory power.

The advent of complex calculating machines, however, has furnished us with new ways to go about constructing an explanation of the world. Computer modelling offers ways of constructing a model of how tables, elbows and apples might behave under different sets of circumstances. Using a computer, inputting slightly different sets of data and observing the outcomes can generate a more complete understanding of how apples roll and fall, and allow us to make more accurate predictions. Performing the sheer number of calculations necessary to sustain this mode of thought was largely impossible before the advent of computers; now it allows us to predict the weather and find distant planets.
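
As a toy illustration (every number here is invented), one might model the fall and simply vary the inputs, rather than reasoning the answer out from first principles:

    # A toy sketch: run a simple model of a falling apple under slightly
    # different circumstances and observe the outcomes.
    import random

    def time_to_fall(height_m: float, g: float = 9.81) -> float:
        """Model: constant acceleration, air resistance ignored."""
        return (2 * height_m / g) ** 0.5

    for trial in range(5):
        h = 1.0 + random.uniform(-0.1, 0.1)  # the table height varies a little
        print(f"height {h:.2f} m -> falls in {time_to_fall(h):.3f} s")

Trivial here, but the same move, running a model many times under varied inputs, is what weather prediction and planet-finding scale up.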

[Figure: paradigmatic, narrative and algorithmic modes of thought, with the algorithmic mode placed off to one side]

Those who have advanced the idea of an algorithmic mode of thought would be hard pressed to say where to place it in Bruner’s schema. Algorithmic thinking is not necessarily about categorizing the world, or logically predicting outcomes, nor is it about narrating an explanation, although it may involve both these things. It shares with paradigmatic thought a concern with establishing procedures, but shares with narrative thought a concern with sequencing. It should probably, then, be seen as a separate mode of thought. In the diagram on the right, I have represented this by placing it off to one side.

But, as we noted, there are considerable overlaps, so it would probably be better to represent this schema as a series of overlapping circles. This schema has the advantage of allowing for greater nuance. Bruner himself, for example, shows how narrative thought lies behind much of scientific explanation.

[Figure: paradigmatic, narrative and algorithmic modes of thought as overlapping circles]

Behind the algorithmic mode of thought lie the mathematical modelling of the world and the methodology of simulation. Mathematical modelling, enabled by the power of computers, represents powerful new ways of acquiring knowledge of the world. Evolutionary Engineering, for example, seeks to use comparisons between random designs to reach more efficient designs in the fewest number of steps. Rather than seeking to design complex solutions from first principles (the paradigmatic view), one compares one design to another, selects the better of the two, and repeats the process; optimal designs can thus be reached in relatively short periods of time.
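
The core loop is simple enough to sketch. Here is a minimal, hypothetical version in Python; the “design” is reduced to a single number and the fitness function is invented, whereas real evolutionary engineering evaluates far richer designs:

    # A minimal sketch of the compare-mutate-keep loop described above.
    import random

    def fitness(design: float) -> float:
        """Hypothetical quality score: the best possible design is 3.7."""
        return -(design - 3.7) ** 2

    best = random.uniform(0.0, 10.0)           # start from a random design
    for step in range(200):
        rival = best + random.gauss(0.0, 0.5)  # a random variation on the best so far
        if fitness(rival) > fitness(best):     # compare, and keep the better design
            best = rival

    print(f"best design found: {best:.2f}")    # converges near 3.7

No step in the loop derives anything from principle; the knowledge of what a good design looks like emerges from running the comparisons.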

Cognitive Offloading & Cognitive Technologies

Because we have such small working memories (Sweller, 1988) we constantly need to offload our cognition onto the environment. We use language and writing to achieve most of this, but we also use our fingers when counting, or devices such as calculators, calendars, books, and increasingly these days digital devices such as cell phones, computers or the Internet. Dror (2007) argues that digital tools have become so pervasive and integral to our thinking processes that they cease to be external aids and become internal components in our cognitive processes. Cognitive technologies, in other words, have become instrumental in shaping our cognitive processes. New forms of distributed cognition (Clark & Chalmers, 2010) have arisen, in which our thinking is distributed across machines and across other people (Dror & Harnad, 2008), radically altering our cognitive landscapes.

These new theories about cognition and knowledge, encompassed within what is known as the Second Cognitive Revolution in psychology (Miller, 2003), clearly represent radical new ways of looking at what it means to be human, and how we see the relationship between man and machine.

Computational Thinking & Digital Thinking

Jeanette Wing, who brought the term Computational Thinking to prominence, described it as “a universally applicable attitude and skill set everyone, not just computer scientists, would be eager to learn and use” (Wing, 2006). Nevertheless, despite a growing receptivity to the idea that Computer Science should be considered a core subject in the curriculum (Grover & Pea, 2013), and despite the growing integration of ICTs in our lives and educational systems, Computational Thinking is still seen largely as part of the Computer Science curriculum rather than as a mode, or disposition, of thought which pervades all subject disciplines. Lee argues that there is a tendency to conflate Computational Thinking with the operational steps involved, with computer literacy, or with information literacy (Lee, 2016). This misses the point that Computational Thinking is in essence the set of dispositions of thought that human beings use when they are using machines to automate thinking processes, or when offloading cognition onto machines.

I think part of the difficulty lies in the use of the term computational. As Wing stresses, the aim of Computational Thinking is not to get human beings to think like computers, but to describe how human beings think when they are using machines to solve problems. The term computational is so firmly rooted in the mathematical and engineering sciences that it tends to preclude the use of computers for perhaps their most vital task, that of communicating and accessing knowledge. How knowledge is legitimated and disseminated lies at the centre of our cognitive identities, and the new digital technologies are in the process of transforming the rules of the game. I believe that it is more useful to speak of how human beings organise their thought processes around problem solving using machines as digital thinking rather than computational thinking. This semantic change allows us to conceptualise computers as communication as well as computational devices and to see the dispositions of thought as including both purposes.

Digital technologies have transformed our practices in any number of ways, but it is important to note that every shift in practice represents a mental shift as well. It changes how we approach problem solving in subtle but important ways.

Digital Thinking vs Analogue Thinking

The second move is to claim that human thought is undergoing something of a revolution, and that particular aspects of our cognition are being accentuated and transformed by the rapid advances in digital media and what this affords. These changes are sufficiently momentous, and the contours of such change sufficiently discernible that we can begin to talk about a difference between digital and analogue thinking. These differences are not absolute; we have not acquired new brains overnight, but they are polarities which are impacting on our cognitive lives. I believe that those aspects of our thought which are afforded by digital technologies have been growing in importance, in line with the development of the information economy and the new world order, which has altered our discursive landscape (Gee & Lankshear, 2006) and may well be altering our brains as well. Neuroplasticity ensures that changes in practice have an impact on our brains (Prensky, 2001b), and it must be assumed, our cognition also.

[Figure: analogue vs digital thinking]

That we are experiencing change is not in contention, but how to characterise that change is. There are perhaps two possible views on this. The first, what we might term the hard view, is that Digital Thinking represents a new mode of thought afforded by the new digital media. This modality of thought has sufficient points of difference from Analogue Thinking to warrant identifying a range of habits and dispositions of mind which signal successful practice. This view sees cognition as being revolutionised. Stevan Harnad (1991), for example, talks of the Fourth Cognitive Revolution, as described below, and this has obvious ramifications for our behaviour and habits. The soft view, on the other hand, sees cognition as largely unaltered by digital media, but holds that new approaches and dispositions are needed for dealing with the differences inherent in using digital media. The illustration above attempts to characterise the two polarities of analogue and digital thinking as cutting across the different modalities of thought we discussed previously (Paradigmatic, Narrative and Algorithmic thought).

I do not believe that we can yet adjudicate between these views. But whether it is cognition itself that is changing, or only our habits of mind, does not essentially alter the schema which can be discerned emerging from the literature around digital literacies, cognitive psychology and computational thinking. In either event it makes sense to map the contours of how we perceive the terrain as altered by the introduction of digital technologies.

In an always-on, always-connected world, as more and more devices become connected in the Internet of Things, and as connectivity becomes more commonplace and more integral to our lives, many are asking questions about what this means for our cognitive development. Headlines scream that the Internet is making us dumb (Carr, 2011; Thompson, 2013) and neuroscientists suggest that our brains are being re-wired (Siemens, 2014; Greenfield, 2014), for good or ill. Psychologists suggest that it is affecting our memories (Sparrow, Liu, & Wegner, 2011) or our social lives (Wooley, 2013). There is a sense in which we stand at a crossroads, uncertain of both the direction and meaning of the rapid changes which are shaping our society. Utopian dreams and dystopian fears permeate the debate around the extent to which technology is driving change. Advances in Artificial Intelligence raise the prospect that we will soon not be the only sentient beings on the planet, and usher in a Brave New World where the difference between Human and Machine may not be that clear cut.

I would argue that never before have we needed a roadmap for navigating the future as much as we do now. Moore’s Law (Gibbs, 2015), the observation that the number of transistors on a chip doubles roughly every two years, while strictly a claim about hardware, serves as a metaphor for the rapidity of change and expresses the sense in which the rate of change is exponential: a doubling every two years compounds to roughly a thirty-fold increase each decade. I would argue that what is required is a new set of habits and dispositions, strategies for dealing with this change, and with the changes associated with the Internet revolution. Our old ways of thinking, the habits and dispositions of centuries, may not be enough to guide us in the future.

The advent of the printing press saw the introduction of a medium of mass communication which has had far-reaching implications for our history, society, and thought. When books were available only to a select few, and knowledge was dependent upon individual transmission, knowledge was hierarchical and legitimated by an appeal to authority alone. In the twelfth century Peter Abelard initiated a revolution in knowledge (Campbell, 1991), an appeal to evidentiary support in challenge to authority, where even theology had to bow before reason, but it was only with the invention of the printing press, and the Industrial Revolution allowing for the mass production of books, that knowledge could be distributed more widely, ushering in the ideological sea change represented by the Enlightenment. How knowledge is legitimated, disseminated and reproduced changes completely how individuals think, learn, create and solve problems.

I would argue that we are at a similar turning point, represented by the new digital media. The ability to self-publish, afforded by the Internet, has changed how knowledge is disseminated, altering legitimation practices. Many academics publish material on their blogs, and the speed of publication allows for peer review, though not in the conventional sense. Another example of the future is Wikipedia, that repository of crowdsourced knowledge said to be as accurate as the Encyclopaedia Britannica. Stevan Harnad’s notion of academic “sky-writing” encapsulates the idea of the Internet combining the immediacy of oracy with the reflective power of literacy to bring about the Fourth Cognitive Revolution. Harnad’s (1991) schema is tabulated below.

First Cognitive Revolution: Oracy (hundreds of thousands of years ago). Communication is immediate and interactive. Growth of Wernicke’s and Broca’s areas in the brain?

Second Cognitive Revolution: Literacy (tens of thousands of years ago). Allows for reflection between communications. Growth of Exner’s area in the brain?

Third Cognitive Revolution: The printing press (c. 1450 AD). A revolution of scale.

Fourth Cognitive Revolution: The Internet (c. 1990s). Near-synchronous communication brings together immediacy and reflection. Are our brains being rewired?

This periodisation carries with it a sense of the cognitive affordance offered by the new digital media: near-synchronous communication. Ideas can be disseminated rapidly, enabling quick responses and affording collaborative thinking, but with enough time for some reflection before responses are made. The last column, linked to possible neural developments, is highly speculative, and Harnad himself makes only tentative suggestions in this regard, but it feeds into a contemporary literature suggesting that the move from page to screen carries with it some neural rewiring.

News and opinions are shared rapidly across social media, replacing traditional news sources. In a world where breaking news is not mediated by the professional journalistic eye, but blogged, tweeted and facebooked by participants, we need to reassess how we access knowledge about the world. Content is similarly mediated by new mechanisms. In a world where opinions are discussed through the vehicle of the share, the like and the Internet meme, we need to see knowledge as a commodity that is rapidly traded across the globe in relatively unmediated ways.

I think we can begin to discern the basis for an argument that we are seeing a revolution in cognition, a shift away from analogue, towards digital modes of communication and cognition.

Digital media do not replace analogue channels. As a species we still rely on our senses. Nevertheless, digital media, represented by the Internet and other telecommunication technologies, represent new modes of communication which have far-reaching consequences for how we communicate, think, learn, create and solve problems.

How our thinking is mediated, and the types of cognitive offloading that these technologies afford, may be represented by the following set of contrasts between digital and analogue contexts. In each pairing the digital pole comes first and the analogue pole second, followed by a discussion and an example.

Hyperspace vs the five senses. Push technologies in particular afford the delivery of information from sources not limited to our biological senses. For example, I learn what the weather is like outside from my cell phone rather than by looking out of the window.

Fast paced vs slow paced. Acquiring data in an analogue world is slow and deliberate; gathering data in a digital world is almost immediate and less considered. Speed has advantages and disadvantages and calls for different approaches to evaluating data. I can use a search engine to find out about something rather than going to a public library and finding a book on the subject.

Multi-channel (digital media streams) vs single channel (biological senses). While data in the analogue world comes at us via the multiple channels of our senses, the digital world opens up even more channels, some of them referencing analogue channels, others entirely in cyberspace. This has advantages and disadvantages and calls for different approaches towards data collection. As I type this, I am receiving data from my cell phone, emails and alerts are appearing in my inbox, and I am able to use the Internet to search for information, all on top of what I sense about the world around me through my biological senses, which by comparison tend towards a single channel.

Parallel processing vs serial processing. The speed at which different channels of communication can be accessed affords multi-tasking. Of course, given the severe limitations imposed on our working memory, we can never truly multi-task, and we can take on simultaneous tasks in the analogue world too; but the plethora of extra channels extends our ability to work on many levels at the same time. I tend to take on more tasks in parallel while using digital media, partly because it is more efficient to do so, and partly because my mind is parcelling out sub-tasks which will take time because they are asynchronous communications. For example, having posted a query for help in solving a problem on a forum, while I wait for a reply I write an email and watch a kitty video on YouTube.

Hypertextual screen reading vs linear/narrative page reading. Reading in hyperspace is hypertextual rather than linear or narrative. Hypertextual reading affords rapid synthesis and evaluation; linear, narrative reading affords a closely argued train of thought. Using a website I can rapidly skim for relevant information by following links and using the Find or search function. If I want to follow a train of thought I have to read sequentially.

Fast and wide thought vs slow and deep thought. Multi-tasking and hypertextual reading afford the rapid synthesis and evaluation of ideas and the identification of patterns: fast thought. Linear and narrative reading affords closely reasoned consideration of a train of thought. Most decisions we make are the result of fast thought, and digital media support it by making more information available and allowing us to find patterns more actively. Page reading, however, affords deep thought, because I have to follow the sequence of words in the order they were written.

The reader creates the text vs the writer creates the text. Hypertext is epistemologically predisposed towards a constructivist paradigm in which readers construct their own meaning from the world around them: the text is created by the reader, following links. Print text is predicated on an author communicating and transferring a message, and is epistemologically neutral: the text is created by the writer.

Constructivist vs instructivist pedagogical paradigms. Theories about how we think and learn have changed considerably over the last hundred years or so, with a shift towards theories stressing how meaning is constructed rather than perceived.

Learning by participation vs learning by acquisition. Sfard (1998) has argued that there are two metaphors for learning, and that both are valid: learning as participation and learning as acquisition.

Symbolic manipulation vs symbolic creation. Fast thought affords rapid symbol manipulation, while slow thought affords the creation of new symbols. Digital media allow us to rapidly manipulate information in many forms, but page reading allows us to generate new concepts.

Computational (algorithmic) thinking vs narrative and metaphoric thinking. Jerome Bruner has shown the importance of narrative in our cognition: if we think about an issue we tend to do so by relating it, and we also think metaphorically, comparing things to something similar in our experience. Computational thinking is something different; it operates at the level of simulation and modelling. Digital media allow us to create simulations by modelling a problem and playing it out, rather than thinking through a solution by relating its contours, and this often produces unexpected and counterintuitive results.

This schema is presented as polarities of what must inevitably form a continuum; nor do I believe that the poles form a rigid dichotomy. It is far more accurate to say that there is a dialectical relationship between the two.

The Foundations Of Digital Thinking

This characterization of twin polarities, operating in a dialectical (dialogic) relationship rather than in opposition, allows us to argue that there are several elements in human thought which, if not new, are at least being emphasised and foregrounded by the growing centrality of computers and ICTs in our lives. If we accept that even more important than our abilities and capacities are our dispositions, our ingrained behaviours and responses to the problems and challenges we face, it becomes clear that how we react to this shift in technology from analogue to digital is crucial in what we can accomplish.

What elements then constitute the dispositions of successful users of computational and digital media? If we look at successful use of digital media I think we can start to identify a number of elements.

Embracing Change

We all know that some people seem to embrace change and use it to their advantage, while others appear fazed by change and shy away from it, or appear less imaginative and productive in the ways in which they use it. This has been falsely framed as a generational divide (Prensky, 2001a); that those born after 1985 (digital natives) are somehow wired differently, and have a natural affinity with ICTs which older people (digital immigrants) lack. While our brains may well be wired differently, if I am following this debate correctly, neural plasticity implies that all users of new technologies, regardless of age, will experience changes. There is a great deal of research indicating that this notion of a differently wired, differently thinking generation is a myth (Kirschner & van Merriënboer, 2013). There is no evidence to suggest, for example, that a new generation of youngsters is somehow predisposed to self-directed, self-paced, collaborative learning because their brains are wired that way.

The divide is really one of disposition. Some youngsters are “immigrants” as much as some older people are, and some older people are completely at home with technology. Exposure to, and comfort with, a technology does not in itself imply a capacity to use that technology in the most beneficial ways. For example, technology may afford self-directed learning, but it is one’s dispositions which determine whether or not these opportunities will be taken up. These dispositions generally need to be taught, and herein lies the central paradox of education. Discovery learning, as attractive as it sounds, is ultimately a deeply flawed idea because of the scholar’s dilemma: how can you discover something until you know it is there to be discovered?

For example, our brains, constrained by cognitive load, limit our ability to multi-task (Kirschner & van Merriënboer, 2013; Sweller, 1988). Our dispositions and habits, however, may allow us to handle multiple tasks with greater facility, overcoming the limitations of our biology to some extent.

Successful thinkers tend to embrace change rather than balk at it. Thinking Digitally means that you are constantly ready to adapt to change, and see it as an opportunity rather than a constraint.

Production vs Consumption

They say a picture is worth a thousand words. There is all the difference in the world between creating and editing pictures and viewing them. It is not, however, that the one is good and the other bad. There are skills and sensibilities, dispositions and capacities which are involved in both activities. To read a picture effectively requires a thorough understanding of the genres and contexts in which the picture was taken. Producing a picture also requires skills and abilities, understandings and literacies. I would argue then that we cannot privilege authorship above criticism. Both have value and a place. We are all at some stage or another a producer or a consumer of both analogue and digital technologies and products.

Nevertheless, there are good habits and dispositions around production, and likewise for consumption. These dispositions are likely to be very similar in analogue and digital contexts. Having said this, however, we need to highlight one vital difference.

All technologies require knowledge and skills both around production and consumption. But some technologies are clearly more complex than others and require specialised skills and knowledge. While almost everybody learns to read and write, very few learn to write software code. While most people can use a screwdriver, relatively fewer can use a soldering iron. This complexity factor limits and constrains the use of all technology. But digital technology represents a level of complexity which makes this division between people who can and people who can’t more stark. It is relatively rare to hear someone say of any analogue technology that it is simply beyond them and they cannot use it in any way. And yet this attitude towards digital technology is frequently heard.

A person who is relatively handy can tinker with, and fix a range of analogue technologies without ever being an expert in any of them. And yet, increasingly this tinkering is not possible with digital technologies as the technology requires such specialised skills and knowledge that the ordinary “handyman” cannot fix a problem.

I am no motor mechanic, but even I have been able to get a Volkswagen Beetle motor running again with a little logic, a prayer, and a spanner. There is no way that I could do the same with modern cars, with their electronic complexity and sealed units. The growing complexity of our machines threatens to turn us all into helpless consumers, and robs us of the ability to tweak and tinker which is an essential part of being an effective producer.

Personal computers were first introduced very much within a tinkering culture; the first computers were shipped with no applications – users had to create their own programs. These days the reverse is true – computers and devices are loaded with a wealth of applications, and programming is both unnecessary and discouraged. Production within each application is further consumerized by a move towards authorship using templates and wizards. Computing has moved from a strong producer ethos to one which encourages consumption. The rapid growth in computer usage explains this shift, but I would argue that as the user base expands so does the need to actively encourage producer dispositions, which are being lost by the ease of consumption.

Growing social complexity and knowledge specialization means that increasingly we are reliant on others to produce and fix, what formerly we could tinker with ourselves. This leaves us often helpless. While knowledge and skills specialization accounts for some of this movement from producer to consumer cultures, much of it is driven by underlying habits and dispositions. We can see in the Maker Movement (“Maker culture – Wikipedia, the free encyclopedia,” 2016) and in the thinking behind Computational Thinking, a growing reaction to this trend. As analogue technologies become increasingly digitized, I would argue that the need to foster producer mentalities and dispositions should be seen as a core aspect of Digital Thinking.

Successful thinkers display dispositions towards producer mentalities. Thinking Digitally means that you are able to Hack Your Life.

Creativity vs Drudgery

We face a future in which Artificial Intelligence will transform all aspects of our economy and society (Cellan-Jones, 2014). While we do not know what the future holds in store, I would argue that we need to see the growing technologization of society as an opportunity, and need to emphasise a disposition towards creativity rather than drudgery. Technology may be used to free up our lives for leisure and creativity, or for drudgery and work. Andrew Feenberg (1991) described the central contradictions between the potentialities for control and democratization inherent in technology, and pivotal to speculation over the role of technology have been dystopian and utopian visions of the future linked largely to these polarities.

Do we face a future in which a few use technology to control and pacify the many, in which humanity at large is debased by a digital divide in which they are passed over, or will technology free the majority from lives of drudgery to unlock their full potential? This is very much a political choice, and depends upon our social organization. The sharp debates around the politics of Austerity reveal the contours of the choices we have to make within the new world order, whether to accept Neo-Liberal Taylorist arguments about the nature of Capital, or to seek Socialist alternatives and a more equal distribution of wealth. The stark choice is whether we valorize capital above labour, or see technology as an opportunity to unlock human potential in ways not previously imaginable.

We urgently need a disposition towards creativity over drudgery in the ways in which we approach our use of technology.

Jane McGonigal (2011) has written about what she calls the four superpowers of games, and how these powers may make a better world. These superpowers form perhaps the dispositions of successful gamers. McGonigal argues they are the crucial dispositions for life itself. They are:

  1. Urgent Optimism
  2. Blissful Productivity
  3. Social Fabric
  4. Epic Meaning

Her characterization of gaming as empowering in gamers a hopeful expectation of success, in conjunction with productive work within a social network to accomplish meaningful tasks, expresses a disposition towards optimistic, humanistic outcomes, resonating with a view of human purpose vastly at odds with the Malthusian vision of humanity as mere cogs in a machine.

Successful thinkers display dispositions towards creativity and unlocking their human potential. This requires constantly finding new ways to express yourself through technology, rather than allowing it to be used to marginalise your humanity. Thinking Digitally means that you are alive to the human within the machine.

Conclusion

In this exploratory paper I have attempted to argue that the most crucial element of our emerging relationship with technology is the dispositions and habits of mind we bring to our use of machines to automate and extend our thinking. The digital divide is often conceptualised as a divide around access to devices, but in reality it is a mental divide between those whose dispositions empower human creativity and problem solving, and those whose habits of mind limit the realm of the possible.

Bibliography

Bruner, J. (1991). The Narrative Construction of Reality. Critical Inquiry, 18(1), 1–21.

Campbell, J. (1991). The Masks of God: Creative Mythology. Penguin Group USA.

Carr, N. (2011). The Shallows: What the Internet Is Doing to Our Brains. W. W. Norton & Company.

Castells, M. (2011). The Rise of the Network Society: The Information Age: Economy, Society, and Culture. John Wiley & Sons.

Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind – BBC News. Retrieved April 8, 2016, from http://www.bbc.com/news/technology-30290540

Clark, A., & Chalmers, D. J. (2010). The Extended Mind. In The Extended Mind (pp. 26–42).

Costa, A. L., & Kallick, B. (2008). Learning and Leading with Habits of Mind: 16 Essential Characteristics for Success. ASCD.

Dewey, J. (2007). Human Nature and Conduct: An Introduction to Social Psychology. Cosimo, Inc.

Dror, I. E. (2007). Cognitive Technologies and the Pragmatics of Cognition. John Benjamins Publishing.

Dror, I. E., & Harnad, S. R. (2008). Cognition Distributed: How Cognitive Technology Extends Our Minds. John Benjamins Publishing.

Feenberg, A. (1991). Critical theory of technology. Oxford University Press, USA.

Gee, J. P., & Lankshear, C. (2006). The New Work Order: critical language awareness and “fast capitalism” texts. Discourse: Studies in the Cultural Politics of Education. http://doi.org/10.1080/0159630950160102

Gibbs, S. (2015, July 9). Moore’s law wins: new chips have circuits 10,000 times thinner than hairs. Retrieved March 22, 2016, from http://www.theguardian.com/technology/2015/jul/09/moores-law-new-chips-ibm-7nm

Greenfield, S. (2014). Mind Change: How digital technologies are leaving their mark on our brains. Random House.

Grover, S., & Pea, R. (2013). Computational Thinking in K-12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43.

Harnad, S. (1991). Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge. Public-Access Computer Systems Review, 2(1), 39–53.

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin UK.

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do Learners Really Know Best? Urban Legends in Education. Educational Psychologist, 48(3), 169–183.

Lee, I. (2016). Reclaiming the Roots of CT. CSTA Voice, 12(1), 3–4.

Maker culture – Wikipedia, the free encyclopedia. (2016, March). Retrieved April 8, 2016, from https://en.wikipedia.org/wiki/Maker_culture

McGonigal, J. (2011). Reality is Broken: Why Games Make Us Better and How They Can Change the World. Random House.

Miller, G. A. (2003). The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.

Prensky, M. (2001a). Digital Natives, Digital Immigrants: A New Way to Look at Ourselves and Our Kids.

Prensky, M. (2001b). Digital Natives, Digital Immigrants Part 2: Do They Really Think Differently? On the Horizon, 9(6), 1–6.

Reading A Book or An Hour of Code? (2016, March 22). Retrieved May 10, 2016, from https://lh3.googleusercontent.com/-bQWsEynX-8A/VgGCHw2AdrI/AAAAAAABWTU/_y9VVvBQYgM/w1728-h1728/computational-thinking-white-bg.jpg

Ritchhart, R. (2002). Intellectual Character: What It Is, Why It Matters, and How to Get It. John Wiley & Sons.

Ritchhart, R., & Perkins, D. (1997). When Is Good Thinking? In D. Dai & R. Sternberg (Eds.), Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development. Mahwah, NJ: Erlbaum.

Sfard, A. (1998). On Two Metaphors for Learning and the Dangers of Choosing Just One. Educational Researcher, 27(2), 4–13.

Siemens, G. (2014). Connectivism: A Learning Theory for the Digital Age. Retrieved from http://er.dut.ac.za/handle/123456789/69

Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778.

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Thompson, C. (2013). Smarter Than You Think: How Technology is Changing Our Minds for the Better. HarperCollins UK.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35.

Wooley, S. (2013). Constantly Connected: The Impact of Social Media and the Advancement in Technology on the Study Abroad Experience. Elon Journal of Undergraduate Research in Communications, 4(2). Retrieved from http://www.studentpulse.com/articles/822/constantly-connected-the-impact-of-social-media-and-the-advancement-in-technology-on-the-study-abroad-experience


 

Teaching ICTs – Computational Thinking Pedagogies & Thinking Strategies

As more and more schools start teaching computing explicitly, in one form or another, to all students, the focus moves from advocacy, getting computing for all into schools, towards pedagogy: how best to teach it. There are a number of pedagogical approaches and cognitive strategies that have been promoted, but I would like to look at just one which I have found effective over the years. I think it would be fair to say that many teachers of ICTs are self-taught, and perhaps for this reason the discovery method is still remarkably popular. The influence of Seymour Papert’s Constructionism, allied with Constructivist learning principles, has made direct instruction far less prevalent in the computer class than in perhaps any other classroom. Nevertheless, a lack of direct instruction, I believe, threatens to undo many of the benefits of a discovery learning framework.

Discovery learning is predicated on setting problems which will allow students to learn through grappling with trying to find solutions, drawing on their past learning and knowledge. But there is the danger that no learning will take place at all if there is no scaffolding of the discovery process, and ironically, the greater the reliance on student-centered learning, the greater the need for directed teacher interventions. The greatest weakness of the discovery method is encapsulated in the scholar’s dilemma. How do you discover something that you don’t know exists? You need a more experienced other to, at the very least, nudge you in the right direction.

Teachers who use discovery learning need to be careful to make sure that students have the tools they need to learn something from problem-based approaches. This scaffolding can take many forms, but without it learning is a very hit and miss affair. Teachers need to adopt a range of strategies to scaffold the kind of knowledge that will feed into problem-based learning activities. For example, you can carefully guide students through the base skills and knowledge needed, and then set more open-ended projects that build on this learning. Or you can set open-ended projects and make knowledge available where needed as students explore what they need to discover to solve the problems set. Alternatively you can pursue a mix of these approaches, giving some instruction up-front, and then supplementing knowledge where it is needed.

But another approach is also possible. If students are working in groups, you can use a jigsaw technique. Students can be split into expert groups and work on guided assignments so that they become experts in one aspect of the overall task. Groups are then formed by taking a member from each expert group so that each group has multiple experts in different aspects applicable to the task. They then teach each other what needs to be known to perform the task. For example, if a project uses spreadsheets, one expert can be trained in formatting cells, another in creating formulae, another in functions, one in graphing, and so on. When they combine they should be able to tackle a task which requires all these skill sets, and students should learn from each other.

An alternative approach is to have groups work on sub-tasks which combine to solve a larger problem. Each group works on a task which, combined with the others, comes together to provide an elegant solution. For example one group might work on moving a robot, another on the operation of its crane, and so on. These approaches do, however, require different students receiving different instructional input. This can be achieved using a flipped classroom model. Even when I cover material in class, I make a video of the content and post it on my Learning Management System so that it is accessible to everyone, even if a student did not do that particular task at the time.

Having said this, managing different instruction for different students is a logistical problem. I prefer to have a wide range of tasks and either allow students to choose tasks they wish to pursue, or to focus tasks around building capacity for particular projects. An example would be the task below. By preparing task cards which set out the brief for any task, but also include a hint about how to proceed, students can work at their own pace, or use the tasks to build up the skills needed to tackle larger projects. The task card may include very detailed steps to follow, or simply hints.

I have gamified the tasks, which is not necessary, but does, I believe, add a certain something. If you allocate Experience Points (XP) for completing tasks, you can use the number of XP earned to unlock larger projects or challenges, much in the way that XP allows characters to level up in RPGs. This encourages students to complete as many tasks as possible, hopefully accumulating the base skills needed for the challenges and projects. Because the hints and instructions are on the reverse side of the task card, as a teacher you do not need to do a great deal of whole-class instruction. Jigsaw groups, or sub-task groups, can work relatively independently, typically after a whole-class session when introducing a new application. If students are still stuck I am able to answer individual queries in class, and there are always the videos as back-up.
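
The bookkeeping behind the mechanic is minimal. Here is a hypothetical sketch in Python; the task names, XP values and thresholds are all invented for illustration:

    # A minimal sketch of the XP mechanic: tasks earn XP, and accumulated
    # XP unlocks larger projects, as levelling-up does in an RPG.
    TASK_XP = {"format cells": 10, "write formulae": 20, "draw a chart": 20}
    PROJECT_THRESHOLDS = {"budget project": 30, "robot challenge": 50}

    def unlocked(completed_tasks: list[str]) -> list[str]:
        """Return the projects a student's accumulated XP has unlocked."""
        xp = sum(TASK_XP[task] for task in completed_tasks)
        return [name for name, needed in PROJECT_THRESHOLDS.items() if xp >= needed]

    print(unlocked(["format cells", "write formulae"]))  # ['budget project']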

 

Computational Thinking – a new modality of thought or just what coders do?

I want to pose a question for consideration. There is a great deal of debate and disagreement over what Computational Thinking means. For some it describes how computer scientists go about what they do, akin perhaps to the scientific method for scientists (Wolfram, 2002), and is applicable only to computer scientists. For others it is a skill set that has implications beyond the field of computer science, a set of generalizable skills of benefit to all (Wing, 2006). A third view is that it represents something of a new mode of thought capable of unique explanations (Papert, 1980) and knowledge building. In this sense it goes beyond a set of procedures, like the scientific method, and might represent a mode of thought distinct from the paradigmatic (argumentative) and narrative modes of thought proposed by Bruner (1986).

The paradigmatic mode represents knowledge founded on abstract understanding or conceptions of the world. For example, I could explain why an apple fell to the ground by referencing the theory of gravity. This is largely the language and understanding of Science. The narrative mode of thought represents an understanding of the world founded in human interactions. I might explain why an apple fell by referencing a sequence of events in which my elbow knocked it off the table and I was not deft enough to catch it. Of course there is a continuum along which both modalities of thought intersect and interweave. So, my question is whether computational thinking represents a separate mode of thought in its own right, or simply new combinations of paradigmatic and narrative modes. If I were to model a world of apples, elbows and tables, my understanding of why apples fall might be based on a more complete understanding of how apples behave under different circumstances. The use of computational models allows for new ways of understanding the world, new ways of gaining understanding and knowledge. Chaos Theory, for example, emerged out of computational model building: paradigmatic formulations of the world followed from computational modelling, rather than the other way round.
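
The logistic map offers a small illustration of this: chaotic behaviour that reveals itself by running the computation, not by inspecting the formula. A minimal sketch in Python:

    # The logistic map, x -> r * x * (1 - x). In the chaotic regime (r = 4),
    # two almost identical starting values soon diverge completely.
    def iterate(x: float, r: float = 4.0, steps: int = 40) -> float:
        for _ in range(steps):
            x = r * x * (1 - x)
        return x

    print(iterate(0.200000))  # some value between 0 and 1
    print(iterate(0.200001))  # a tiny change in input, a wildly different output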

When we create a computational model of a weather system and run our algorithms through computers with slightly different inputs to make a hurricane path forecast, for example, or use machine learning algorithms to predict heart disease more accurately, are we deploying a new kind of thought which is somewhat different from both paradigmatic and narrative modes?
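
In miniature, an ensemble forecast looks something like the following sketch; the “storm drift” model and every number in it are invented for illustration:

    # A toy ensemble forecast: run the same simple model many times with
    # slightly perturbed inputs and report the spread of outcomes.
    import random
    import statistics

    def final_position(speed: float, hours: int = 24) -> float:
        """Model: a storm drifts at roughly `speed` km/h, with hourly noise."""
        position = 0.0
        for _ in range(hours):
            position += speed + random.gauss(0.0, 0.2)
        return position

    runs = [final_position(speed=random.gauss(1.0, 0.05)) for _ in range(100)]
    print(f"forecast: {statistics.mean(runs):.1f} km, "
          f"spread: {statistics.stdev(runs):.1f} km")

The knowledge produced, a distribution of outcomes rather than a single derived answer, is characteristic of the mode of thought in question.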

The need to ask this question rests, perhaps, on the rapid development of Machine Learning and how it threatens to disrupt our world. Machine Learning has brought us to a point where we might soon be farming out most of our thinking to intelligent machines. And while probabilistic approaches to artificial intelligence allow human beings to trace back what the machine has done with our algorithms, neural networks, with their black-box approach, represent thinking that is to a large extent opaque to us. It seems entirely possible, then, that in the not too distant future machines will be delivering to us knowledge of the world, and we will not be able to explain the thinking behind it.

The idea of Computational Thinking (CT) has a history, and it is interesting to unpack some of it. The term was coined by Seymour Papert (1980) and popularised by Jeannette Wing (2006), and there is general consensus that it refers to the thinking skills employed by computer scientists when they are programming, derived from the cognitive processes involved in designing an algorithm for getting “an information-processing agent” (Cuny et al, 2010) to find a solution to a problem. For some, information-processing agents should refer only to machines; for others they could include human beings performing computational tasks. Differences over how applicable CT is beyond computer science hinge on these nuances of understanding. I have often heard it said that getting students to design an algorithm for making a cup of tea represents CT, and that if students were to study algorithm design through learning to code they would therefore be improving their general problem-solving skills. These claims are difficult to assess, but they are important, because if CT applies only to the context of computer science, then its place in the curriculum occupies something of a niche, important though it might be. If, however, as claimed, it leads to benefits in general problem-solving skills, there is a solid case to be made for getting all students to learn programming. Equally, the case for exposing all students to some coding might rest on other claims unrelated to the transfer of CT to other domains.
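
To see why the nuance matters, consider the tea example written out. Any decomposition of the task into an ordered series of steps looks algorithmic, but no step controls a computational model; the steps are executed by a human. This version is my own hypothetical sketch, one of many possible:

    // A hypothetical "make a cup of tea" algorithm: an ordered series of
    // steps carried out by a person, not a computational model.
    const makeTea = [
      "Boil the kettle",
      "Place a teabag in the cup",
      "Pour the boiled water into the cup",
      "Steep for three minutes",
      "Remove the teabag",
      "Add milk if wanted",
    ];

    makeTea.forEach((step, i) => console.log(`${i + 1}. ${step}`));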

Let’s start by looking at the claims made by the Coding for All lobby. Wing (2006) argued that CT skills have transferable benefits outside of computer science itself because they entail five cognitive processes, namely the following (a short illustrative sketch follows the list):

  1. Problem reformulation – reframing a problem so that it becomes solvable and familiar.
  2. Recursion – constructing a system incrementally on preceding information.
  3. Decomposition – breaking the problem down into manageable bites.
  4. Abstraction – modelling the salient features of a complex system.
  5. Systematic testing – taking purposeful actions to derive solutions (Shute et al, 2017).
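
To make these processes less abstract, here is a toy sketch in JavaScript (the problem and code are my own hypothetical illustration, not Wing's or Shute's). Totalling the numbers in a nested list uses decomposition (handle one item versus the rest), abstraction (only the nesting structure matters, not what the numbers mean), recursion (build the total from totals of smaller lists) and systematic testing:

    // Toy example: total of all numbers in a nested list.
    function nestedSum(items) {
      let total = 0;
      for (const item of items) {
        // Recursion: a nested list is a smaller version of the same problem.
        total += Array.isArray(item) ? nestedSum(item) : item;
      }
      return total;
    }

    // Systematic testing: purposeful cases, including the empty edge case.
    console.log(nestedSum([]));               // 0
    console.log(nestedSum([1, [2, [3]], 4])); // 10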

Wing’s claim has received a great deal of attention and has become the bedrock of the Computer Science for All movement: the idea that all children should be exposed to CT by teaching them to code, both because such skills will become increasingly important in an increasingly digital world, and because they equip students for the kinds of problem solving that are increasingly in demand. It is debatable, though, whether these cognitive processes are unique to computational thought. Abstraction and decomposition, in particular, would seem to be thinking processes shared by any number of activities. Wing’s thesis that computational thinking is generalizable to all other fields could perhaps be stated in the reverse direction: perhaps general cognitive processes are generalizable to computation? This point is not trivial, but it does not necessarily threaten the thesis that learning to code or create algorithms is excellent for developing good problem-solving skills applicable to other fields.

The question of the transfer of skills gained in one context to another is, however, fraught with difficulty. Generally speaking, it seems to me that knowledge and skills are gained within the framework of a particular discipline, and that applying them in other contexts is always problematic to some extent. There is a close relationship between knowledge itself and what we call thinking skills. It is hard to imagine, for example, anyone possessing dispositions and thinking skills in History or Mathematics without possessing knowledge in those disciplines. As Karl Maton (2014) has pointed out, all knowledge has both knowledge structures and knower structures: there is the stuff that is known, and the gaze of the knower. In different fields, knowledge structures or knower structures may have greater or lesser relative importance, but one cannot distil out something which is pure knowledge, or pure knowing. The question of the transfer of skills from one context to another, from one field to another, is therefore not a simple one. Of course we do achieve this feat. At some point in my life I learned basic numeracy skills, within the context of elementary arithmetic classes presumably, and I have been able to apply this basic knowledge and skill set in other contexts, for example computer programming. But I am not so sure that the thinking dispositions I gained while studying History at university, and my appreciation for the narrative mode of explanation, are of much use when thinking about Computational Thinking and what I ought to be doing as a teacher of ICT skills. I am painfully aware that there are limits to the general applicability of the enquiry and data analysis skills I learned when training to become an historian. I did not train as a computer scientist, and therefore I am wary of commenting on how transferable skills in computational thinking might be to contexts outside the field. But I do believe we should be wary of claims of this sort. Peter Denning (2017) has argued that the idea that all people can benefit from CT, from thinking like computer scientists, is a vague and unsubstantiated claim. For Denning, algorithmic thinking rests not on merely setting out any series of steps, but speaks specifically to the design of steps that control a computational model. It is context bound.

My understanding from this is that the case for teaching everyone to code cannot rest solely on the argument that CT transfers its benefits; that case has yet to be proven. This does not mean that teaching coding to all is not a good thing. I believe that learning to code is a rigorous discipline which is good for the mind; that it matters because we live in a world where computer programs are increasingly important; and that the problem solving coding involves also benefits the mind. All in all, I think the case for teaching coding to all is extremely cogent.

I also have a sneaking suspicion that the question I posed in my opening remarks will be raised more and more frequently as artificial intelligence is applied more widely, and if so, having a population trained to some level of competence in computational thinking is probably a really good idea.

Bibliography

Bruner, J. (1986). Actual Minds, Possible Worlds. Cambridge, MA: Harvard University Press.

Cuny, J., Snyder, L., & Wing, J. (2010). Demystifying Computational Thinking for Non-Computer Scientists. Work in progress.

Curzon, P., Bell, T., Waite, J., & Dorling, M. (2019). Computational Thinking. In S. A. Fincher & A. V. Robins (Eds.), The Cambridge Handbook of Computing Education Research (pp. 513–546). Cambridge: Cambridge University Press. https://qmro.qmul.ac.uk/xmlui/bitstream/handle/123456789/57010/Curzon Computational thinking 2019 Accepted.pdf?sequence=2&isAllowed=y

Denning, P. J. (2017). Remaining Trouble Spots with Computational Thinking. Communications of the ACM, 60(6), 33–39. https://doi.org/10.1145/2998438

Guzdial, M. (2011). A Definition of Computational Thinking from Jeannette Wing. Computing Education Research Blog. https://computinged.wordpress.com/2011/03/22/a-definition-of-computational-thinking-from-jeanette-wing/

Maton, K. (2014). Knowledge and Knowers: Towards a Realist Sociology of Education. London: Routledge.

Papert, S. (1980). Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books.

Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying Computational Thinking. Educational Research Review, 22, 142–158. https://doi.org/10.1016/j.edurev.2017.09.003

Wing, J. (2006). Computational Thinking. Communications of the ACM, 49(3), 33–35. https://doi.org/10.1145/1118178.1118215

Wolfram, S. (2002). A New Kind of Science. Champaign, IL: Wolfram Media. https://www.wolframscience.com/nks/

 

Coding & Robotics Summit – Johannesburg

Artificial Intelligence and the Fourth Industrial Revolution have become buzzwords in education, often used with little thought or understanding, in slightly Pavlovian ways. There is a very real sense that big changes are afoot, and everyone is nervous about how to respond, and most particularly about being seen to respond. I am not quite sure what to make of it, frankly. While it is clear that the working landscape will change as a result of AI, I am not convinced that much in the educational field changes … until it does. What do I mean? I think it is already crystal clear that education should be teaching critical thinking, collaboration and creativity; I’m not sure that anything has changed there. It is my belief that we should have as broad a curriculum as possible. Drama, History and Music should be as core to our curriculum as STEM. It makes no sense to me to de-emphasise or over-emphasise any field. So while I am all in favour of ensuring that coding & robotics form part of the curriculum, I find the whole STEM, STEAM, and now STREAM (with robotics) debate counter-productive.

There will come a point, however, at which Machine Learning is powerful enough that meaningful AI applications are ready for classroom implementation. When Pressey and Skinner built their teaching machines in the last century they imagined programmed learning which allowed for instant feedback and personalised learning paths, the kind of thing advocated by Pestalozzi back in the late 1700s with his one-on-one tutoring methodology. What emerged, though, was the kind of drill-and-kill learning platform that is the kiss of death for education: computers were simply not intelligent enough to spot when students were gaming them. AI does, however, offer a possible resurrection of the idea, with systems that are far more responsive and capable of analysing student production with sufficient nuance to be useful. Real-time feedback loops enabled by devices which can read the learning brain are not Science Fiction anymore. Nor are teaching machines which can sift the huge amounts of data collected and make sense of them. There will undoubtedly, then, come a point at which AI teaching machines enter the classroom. In the lead-up to that we can probably expect a range of apps that employ AI in some way and address particular pedagogical needs. As I get older, face and name recognition would be nice! But full-blown AI in the classroom is a little way off yet. Meaningful data analytics is probably much closer, but I’m not convinced having a wealth of data is always a good thing. I am also afraid that that data will be harvested for purposes unrelated to education. Imagine how you could Cambridge Analytica a population if you owned the data being collected on how everyone learns and thinks.

The focus of this one-day conference was on Computational Thinking, and on coding and robotics as a vehicle for teaching thinking skills and building the habits of mind and dispositions necessary for a post-singularity world. The South African government has recently announced that it will be introducing coding into the primary and GET phase (middle school) curricula. Karen Walstra opened affairs by walking through the history of Computational Thinking and its component parts, and showing how coding concepts can be used across the curriculum, not just in coding classes. I am planning a blog article on Computational Thinking, so I won’t dwell on it here. Her talk was vital in introducing Computational Thinking and in positioning thinking skills as the foundation of any curriculum changes. What worries me, however, is that increased interest in coding has elevated it beyond what I think it is capable of providing. Computational Thinking, coding and robotics are not a magic bullet which will suddenly solve all education’s ills. They represent a necessary skill to learn, a useful knowledge base, and a set of dispositions that all students need, but they should be seen as forming part of a thinking skills programme, not replacing the existing curriculum. All subjects and all skills are vital, in different ways. Don’t get me wrong, I am all in favour of coding’s place in the curriculum, but learning problem-solving skills requires a broad world knowledge, and there are a number of thinking skills beyond the computational that are needed. Important, yes. A magic bullet, no.

St Enda’s Secondary School students designing a school website, circa 2003.

What her talk did highlight was the notion that all students can benefit from learning coding. The team from CodeJIKA presented a cogent case for this with their wonderful extra-curricular code club programme teaching students HTML, CSS and a little JavaScript. They have an online curriculum which runs largely through peer-to-peer learning. When I was teaching Computer Applications at St Enda’s Secondary School in the early 2000s I used the same approach. HTML and JavaScript are browser-based, so they need no compiler and can be used offline, a huge consideration where Internet connection is a big problem. To my mind starting with a markup language is also helpful because it is easier to slip into, helping students get into the habit of moving between the concrete and the abstract. You can then slip JavaScript in quite organically and start introducing key programming concepts. Robyn Clark, from CodeJIKA, stressed how web design also helps build entrepreneurial skills, giving students a side hustle. The CodeJIKA approach is, to my mind, a fairly easily replicable model for under-resourced schools. It is also flexible and stackable, as app development, robotics and programming proper can be added as the skills and knowledge base grow. Amini Murinda from ORT South Africa presented what they have been doing to expand coding and robotics in a growing number of schools. Both programmes clearly show that coding and robotics initiatives are engaging and transformative.
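
A minimal sketch of what this organic slide from markup into programming can look like (a hypothetical first-lesson fragment, not CodeJIKA's actual material): the page is pure HTML until one small script introduces a function, a variable and an event:

    <!-- A hypothetical first web page: markup first, then one step into JavaScript. -->
    <!DOCTYPE html>
    <html>
      <head><title>My First Page</title></head>
      <body>
        <h1>Hello!</h1>
        <button onclick="greet()">Click me</button>
        <p id="output"></p>

        <script>
          // The first taste of programming: a variable, a function, an event.
          function greet() {
            var name = "Thabo"; // hypothetical student name
            document.getElementById("output").textContent = "Hello, " + name + "!";
          }
        </script>
      </body>
    </html>

Everything here runs offline in any browser, which is precisely the point in an under-resourced setting.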

We heard from two speakers representing robotics companies, who spoke about where robotics and AI are headed, why we should not fear job losses, and how investing in coding in primary schools could reduce failure rates in higher education. These talks provided a useful backdrop and a perspective from the world of work. I would have liked a greater emphasis on curriculum, but the summit was useful in bringing together participants from industry, teacher training, and the secondary and tertiary teaching sectors. It would have been great if Government had also been represented. We need many more of these discussions.

The big take-away for me was the need to bring together these pilot projects and the experience of primary and secondary teachers from the private sector who have been developing their own programmes, to share best practice, and to work on a curriculum and pedagogies that make sense.

 

Developing Tools to Help Students Construct Meaning in Computer Skills

As a teacher of computing applications I have found that the area my students struggle with most is creating and using spreadsheet formulae and database queries. That is to say, they struggle most where they have to apply mathematical formulae, which are by nature abstract, to a concrete task, such as applying a 10% discount if certain conditions pertain. The ability to move seamlessly between abstract and concrete is not something all students possess. Piaget described the movement from concrete to formal operational thinking as a maturational process, with children only becoming capable of abstract thought at around 12 years of age, and abstract thinking is thought to continue developing into adulthood as individuals gain more experience with it. This suggests that students need extensive scaffolding to help abstract thinking skills develop. It is also clear that concepts do not generalise easily across different contexts.
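
The discount task is typical of the kind of formula my students wrestle with. A hypothetical version (the cell reference and threshold are illustrative only, and I have used the semicolon argument separator of the locale shown later in this post):

    =IF(B2>100; B2*0.9; B2)

    [If the amount in cell B2 exceeds 100, apply a 10% discount; otherwise leave it unchanged.]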

I have looked at Semantic Wave Theory previously on this blog (e.g. Maton, 2014), a framework drawn from Legitimation Code Theory which shows how the movement between the abstract, highly condensed and the concrete, contextualised and simple can be used as a tool to show how meaning is being unpacked and re-packed within the classroom. Researchers have shown how successful teaching and learning depend on creating repeated movements between the two, describing semantic profiles.

The diagram above illustrates various semantic profiles, which will be instantly recognisable to any teacher. The high semantic flatline occurs when all discourse in the classroom remains at a general, abstract, very theoretical level; the low semantic flatline when discourse stays simple and practical. Clearly what is needed over time is movement between abstract and concrete, complex and simple: a wave-like graph with greater semantic range. Teachers need to help students understand complex, abstract ideas by unpacking the concepts using concrete examples, personal experience and metaphors. Students also need to learn how to repackage their understanding in more abstract academic language in their own words, and teachers need to scaffold this process carefully.

Understanding semantic waves helps us understand how best to scaffold spreadsheet formulae and database queries: by finding strategies to strengthen and weaken what are called semantic gravity and semantic density, in other words to scaffold movement up and down the semantic wave. Doing this requires an understanding of the relative strengths of semantic gravity and density in various computing applications, and I have to say that this is in itself not an easy task. It seems to me that what appears to be a concrete, practical task for an experienced practitioner often appears abstract and complex to the novice. This is perhaps just another way of saying that as we get used to traversing the gap between abstract and concrete we get better at doing it, and cease to notice it or struggle with it. We operationalise abstract formulae without a second thought, and the task seems simple and concrete to us. We need to try to see it from the perspective of the novice, who needs to bring together an understanding of what the computer must do expressed in plain language, the mathematical or logical language of the problem, and the syntax of the application or programming language. This process needs very careful scaffolding and support.

I have recently come across a cognitive tool called the Abstraction Transition Taxonomy (Cutts et al, 2012). The illustration below comes from the cited paper and demonstrates one way of visualising the processes involved in coding a computer program, or indeed an Excel spreadsheet.

This design process helps bridge the gap between understanding a problem and its solution and translating that into a working program, which then needs to be debugged and checked to see whether it does what it is supposed to do. The key stage is the storyboarding in the middle. I like to think of the steps shown above as comprising the following stages:

  • Plain Language: Think about the problem and work through a solution in your mind.
  • Maths/Logic: Build any mathematical or logical operators into your solution.
  • Application Syntax: Implement your solution using the particular syntax of the app or programming language you are using.

For example:

  • If a class has collected the most money in the school, they get the day off school.
  • If money collected = most money, then day = day off, else day = normal school.
  • =IF(cell=MAX(range); "Day Off"; "Normal School") [in an Excel spreadsheet]

It is tempting to see each of these levels (plain language, maths/logic, app syntax) as discrete strengths of semantic gravity, moving from plain language (strong semantic gravity) to maths/logic (weak semantic gravity) and then back to app syntax (strong semantic gravity). This would describe a wave much like the graph shown below. This is a useful way to conceive of the shifts in levels of abstraction while using a computer to solve a problem.

Over the years of teaching spreadsheets, databases and coding, I have developed a routine for modelling how to go about using computers to solve problems which follows the three steps enumerated above. It is summarised as the ELS method:

  • State the problem and solution in plain English
  • Plug in any mathematical or Logical operators
  • Enter it using the particular Syntax of whatever application you are using

This helps students, I think, by giving them a process to follow, and it helps them move up and down the semantic range, but my grade 8s and 9s still struggle to apply it.
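
The same three steps work when the target is code rather than a spreadsheet. Here is a minimal sketch of the day-off problem worked through in JavaScript (the names and values are hypothetical):

    // E (English): if a class has collected the most money, they get a day off.
    // L (Logic): if moneyCollected = maximum collected, then day = "Day Off", else "Normal School".
    // S (Syntax): express that in JavaScript.
    const collections = [830, 1120, 990]; // hypothetical totals per class
    const moneyCollected = 1120;          // this class's total

    const day = moneyCollected === Math.max(...collections) ? "Day Off" : "Normal School";
    console.log(day); // "Day Off"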

Although the three-step process helps build in a movement up and down the semantic range, it is not enough. Each step represents a semantic range in its own right, for the novice at any rate. When stating a problem's solution in plain language, one needs to hold in mind the contextual parameters of the problem and an ideational abstraction of the solution. When working through the mathematical and logical expression of the solution, one needs to jump continually back to the context of the problem and forth to the emerging formula. Translating this formula into the particular syntax of the application also requires rapid jumps back and forth along the spectrum between weak and strong semantic gravity. Although the curve above may well describe the overall movement of meaning in the task, it seems to me to be made up of rapid oscillations between two states, abstract and concrete, a kind of quantum wave, if you like, as the student superimposes an abstract solution on top of a concrete problem. I believe it is this which makes it particularly difficult for novice programmers and spreadsheet or database creators to navigate the coding of a solution. More experienced programmers handle these shifts with ease.

How and Why Questions help move up and down the semantic range

When using the ELS method in a whole-class situation I model the mental process of thinking through the task very closely, drawing on student contributions. But getting students to work in pairs is also very necessary, as it forces them to voice their mental processes, and this helps strengthen and weaken semantic gravity. If you are explaining why you think the formula should be this rather than that, you are effectively making jumps up and down the semantic range, because you are dealing with why questions, which tend to raise the level of abstraction, and with how questions, which help concretise your solution. When you try something and it doesn't work, having to discuss possible reasons with a peer does the same.

Bibliography

Cutts, Q. et al. (2012). The Abstraction Transition Taxonomy: Developing Desired Learning Outcomes through the Lens of Situated Cognition. In Proceedings of the Ninth Annual International Conference on International Computing Education Research, 63–70. ACM. https://doi.org/10.1145/2361276.2361290

Maton, K. (2014). A TALL Order? Legitimation Code Theory for Academic Language and Learning. Journal of Academic Language and Learning, 8, 34–48.

 

 

Meaning Making in Computer Education

One of the difficulties in looking at the knowledge practices of teachers of middle and high school computing is the diverse nature of educational practices around the world. In some contexts the curriculum resembles Computer Science at a tertiary level, with an emphasis on computer programming and the theory of hardware, software and networking. In other contexts, however, the emphasis is on computing applications. In South Africa, for example, students can take Information Technology as a matriculation subject, in which programming is studied, or Computer Applications Technology, with an emphasis on Office applications. At middle school levels the emphasis is often on basic computer literacy. Coding and robotics are, however, often taught alongside basic computer literacy and information literacy.

Waite et al (2019) have argued that Legitimation Code Theory (LCT), in particular the idea that effective knowledge-building practices involve the formation of semantic waves, provides a framework for assessing the effectiveness of practices in the teaching of computing, by providing a common language for describing diverse practices. I have described Semantic Wave Theory before on this blog, but here is a brief summary.

Karl Maton (2014) has described semantic waves in terms of how teachers try to bridge the gap between high-stakes reading and high-stakes writing, where ideas are highly abstract and context independent (weak semantic gravity) and highly complex and condensed (strong semantic density). In the classroom these highly abstract and complex ideas are introduced in the form of texts, and students are expected to reproduce them in their own words in the form of essays and examination answers. To help students do this, teachers need to give concepts greater context (strengthening semantic gravity) and make them simpler (weakening semantic density). They do this by using examples, metaphors and personal experience. If you map the changes in semantic gravity and density over time, you can describe waves. The ability to make links between the abstract and the concrete, between theory and practice, complex and simple ideas, is what makes for effective teaching and learning.

Waite et al (2019) show how a semantic analysis of an unplugged computer programming task describes just such semantic waves and makes for a successful lesson plan. They also suggest that using semantic waves to analyse lesson plans, and actual lessons, is a way of assessing the effectiveness of lessons teaching computer programming of different kinds. Many teachers use online coding platforms, such as Codecademy or Code Combat. In this article I would like to look at a semantic wave analysis of a Code Combat course on web development to see what it reveals about the platform’s strengths and weaknesses as a pedagogical tool. Code Combat is structured as a series of courses covering a computer science syllabus, teaching JavaScript or Python programming and some HTML and CSS. Each course is divided into a series of levels, and each level introduces key concepts, such as loops and conditional statements, using quests and tasks performed by an avatar. Students enter code in a command-line interface and can run it to test for success. The platform provides hints and text-completion prompts to help scaffold activities.

Students generally enjoy the platform, and take pleasure in grappling with problems and succeeding at each task. I use it in my grade 8 and 9 computer skills classes. In this analysis I looked at the 13 levels that make up the Web Development 1 course, which introduces HTML tags and CSS properties. I looked at semantic gravity alone, with SG– (weak semantic gravity) representing highly abstract ideas and SG+ (strong semantic gravity) representing highly concrete ones, and used three degrees to indicate strength or weakness (SG– – – to SG+++).

I used the following translation device to rough-code the level of semantic gravity in the instructions for each level. The purpose of a translation device is to translate the theory into what it looks like in practice: what does weak semantic gravity look like when using HTML, and what does strong semantic gravity look like?

SG – – – | Over-arching concepts | Tags are used to mark up text
SG – – | Coding concepts | Tags do different things, e.g. <h1> regulates the size of a heading
SG – | Properties of concepts | Tags have properties, e.g. <img> has source, alignment and width
SG + | Examples of concepts | Students must decide which tag to enter
SG + + | Examples of properties | Students must edit a property, e.g. change <img src="" align="left"> to align right
SG + + + | Data entry | Typing in text
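
To make the middle rows of the device concrete, these are the kinds of exercise they describe (the task wording and content are my own illustration, in the spirit of the platform's levels rather than quoted from them):

    <!-- SG+: the student decides which tag to enter to make "Welcome" a heading. -->
    <h1>Welcome</h1>

    <!-- SG++: the student edits a property, changing the alignment from left to right. -->
    <img src="logo.png" align="right" width="200">

    <!-- SG+++: the student simply types text between existing tags. -->
    <p>My name is Sipho and this is my poster.</p>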

The coding of the thirteen levels was done using only the text in the platform; I did not look at the graphics. I would argue that the graphics tend to scaffold all activities by strengthening semantic gravity and helping students visualise what they are coding. The semantic waves formed over time looked as follows:

What we can see is a non-continuous wave which loosely describes the movement between abstract and concrete. Each unit followed a pattern of introducing a particular concept and giving students a chance to practise enacting it; the next level would then introduce a new concept, and so on. In some levels students are able partially to practise their developing understanding of the concepts by choosing which tags to use, rather than merely enacting the one explained. This movement from weak to strong semantic gravity has been described as a down escalator, and is common in teaching practice. Teachers are generally good at explaining concepts so that students understand them. Less common in classroom practice, and less common here, is the full movement of the wave, in which students take concrete examples and are able to express the underlying concepts and display their own understanding effectively. In programming terms this would translate into being able to use concepts learned in novel situations to develop unique solutions; in other words, to move from a concrete problem to a conceptual enactment of it by designing an algorithm or coding.

What the semantic wave analysis seems to indicate is that the units in this course do a good job of explaining the programming concepts, but not a good enough job of giving students a chance to explore and display their understanding in new contexts. As a teacher, I have to say that this is what struck me immediately. The platform could do some things better than I could: it allows students to work at their own pace, gives instant feedback, and is certainly more engaging with its graphics and game-like interface. But it is not able to set more open-ended tasks, or give students a chance to explain their own understanding of the concepts. The course ends with a “design your own poster” exercise which potentially does this, but each level lacks a degree of this full movement through the semantic wave.

This weakness appears to be hard-coded in, and requires teachers using the platform to mediate as a way of creating fuller semantic waves. Given that students are working at their own pace, my own solution was to use mentors in every class. It was the job of the mentor, anyone who had already completed the course, to help peers who were struggling to complete levels by explaining what needed doing. The mentors, at least, were then consolidating their knowledge and understanding by explaining it to others, and mentees were benefiting from having the problem re-phrased or re-contextualised.

I would argue that semantic wave analyses like this one could help inform better instructional design decisions. It might appear as if I am being critical of Code Combat, but I believe that other platforms of a similar kind suffer the same weaknesses. This platform, in fact, is better than most at using constructivist learning principles, asking students to design their own solutions, but more could clearly be done to create full semantic waves.

Bibliography

Maton, K. (2014). A TALL Order? Legitimation Code Theory for Academic Language and Learning. Journal of Academic Language and Learning, 8, 34–48.

Waite, J., Maton, K., Curzon, P., & Tuttiett, L. (2019). Unplugged Computing and Semantic Waves: Analysing Crazy Characters. In Proceedings of the UKICER 2019 Conference (United Kingdom and Ireland Computing Education Research).

 

Fake News & Conspiracy Theories – Teaching Fact Checking!

The Cambridge Analytica story has foregrounded the imperative that we teach students to distinguish between fact and fiction online. All too often, however, the responsibility for this is left to librarians, who seldom have sufficient contact time with students to do any meaningful work, or, even worse, it is left to no-one at all. Subject teachers have full syllabi into which detailed work on how to evaluate truth claims is hard to shoe-horn. There needs to be some discussion of how this is taught explicitly, and of how it can then be used across the curriculum.

The standard approach to teaching students how to evaluate websites is to use fake websites created for pedagogical purposes, such as the well-known tree octopus and dihydrogen monoxide hoax sites mentioned below.

Students are then asked to evaluate these websites, often in conjunction with legitimate websites, to detect which are hoaxes. Common evaluation techniques are usually based around a checklist of concerns: the CRAP Detection method, for example. CRAP stands for Currency, Reliability, Authority and Purpose (or Point of View). Students are asked to assess any website against these criteria and then give an overall verdict. As an IT teacher, I have included this kind of thing in my curriculum for many years. At my school the headmistress felt I should not use a word like CRAP, so I had to invert the acronym, as in the poster shown here.

  • Currency: Is the information reasonably up to date? Does it matter in this case?
  • Authority: Can the author be trusted? Are they an expert in their field? Do they have a reputation? Authority can mean an individual writer or the website or publisher as a whole.
  • Reliability: Is the information factual or is it just an opinion? Does the author give sources so you can check up on what they are claiming?
  • Purpose: Can you detect any bias? Is the site trying to sell you something? Are they trying to persuade you about something?

There are some problems with this approach, however. Students find it very difficult to move from a checklist to an overall evaluation; they tend to get bogged down in the detail and lose sight of the big picture. For example, a student may correctly identify the author as suspect, but then rate the website as reliable because it is up to date. Or they may discount a website simply because it is anonymous. Because context means everything, and truth depends on a wide range of concerns, it is hard enough for adults to pick through the minefield of detecting fake news online; for a teenager it is doubly difficult. No single factor should be taken as definitive.

So much rests on possessing a robust general knowledge. I would argue that while checklists are useful, they need to be combined with a process-oriented approach which is better able to balance all the factors involved.

The use of fake websites (usually created entirely for the purpose of teaching website evaluation) is also somewhat problematic. They are more suitable for younger students; with older teenagers it is better to evaluate real-world examples. Conspiracy theory websites present a much more nuanced content base for honing evaluation skills. The problem, though, is that conspiracies are not necessarily fake, and even highly intelligent and critical thinkers can disagree over which should be taken seriously and which not. As recent court papers attest, drug companies do tell deliberate falsehoods, and historians have exposed government deceptions such as the Gulf of Tonkin incident. And yet students need to be inoculated against undue trust in conspiracy theories. If you search for Climate Change on YouTube, most of the videos apparently question the reliability of the scientific evidence; if you search for Vaccinations and Autism you might well be misled into believing there is a real debate.

The only way to untangle fact from fiction is to have a world view based on a really good understanding of the Sciences and Humanities. Truth can be evaluated both on the basis of Coherence, whether what is being claimed makes logical sense, and of Correspondence with the real world, with data and evidence. Real facts can be totally misinterpreted, and logical claims can be based on shaky evidence. No checklist approach can really untangle this, and yet evaluation needs to be based on a range of factors.

With this in mind I have, over the years, developed a model for teaching website evaluation which takes note of these factors and tries to define an overarching process for evaluating coherence and correspondence. The poster gives some idea of the process, but I usually design worksheets customised for the particular task at hand, with space for answering the questions.

The first step is to complete an evaluation matrix. This can be calibrated in different ways, but it produces an evaluation diamond which gives a graphic representation of the different factors. This allows the student to look into the Currency, Authority, Reliability and Purpose of the website, but to keep this information in the background; it does not immediately lead to an evaluation. The matrix, though, forms a visual reminder: the tighter the diamond, the more likely the website is to be fake.
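
The intuition can be sketched numerically (the 0–5 scale and the threshold here are hypothetical; I calibrate the matrix differently for different tasks). Plot the four CARP scores on perpendicular axes, and the area of the resulting diamond, half the sum of the products of adjacent scores, gives a rough warning signal:

    // Hypothetical CARP scoring: each factor rated 0 (very poor) to 5 (very strong).
    // The four scores, plotted on perpendicular axes, form a diamond whose area
    // is half the sum of the products of adjacent axis scores.
    function carpDiamond({ currency, authority, reliability, purpose }) {
      const area = 0.5 * (currency * authority + authority * reliability +
                          reliability * purpose + purpose * currency);
      return { area, suspicious: area < 10 }; // threshold is illustrative only
    }

    // A current but anonymous, opinionated site scores a "tight" diamond:
    console.log(carpDiamond({ currency: 4, authority: 1, reliability: 2, purpose: 1 }));
    // { area: 6, suspicious: true }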

The student then answers four questions designed to get them to think about whether the information presented corresponds with the real world and is coherent. It is only with the fourth question that students are asked to give an overall evaluation based on their gut feel; this discourages making an evaluation until all the other factors have been considered.

  1. Does the information fit with everything else you know about the world?
  2. Is the information confirmed in other sources?
  3. Does it make sense?
  4. What does your gut tell you? Give a rating from 1 (Fake) to 10 (Reliable)

Students seem to enjoy filling in the CARP diamond and comparing the shapes they produce with others’ responses. Having a visual summary of the evaluation checklist really helps stimulate discussion. The four questions allow students to use a search engine to fact-check the content and the author in greater detail; I would recommend scaffolding this in any worksheet you provide. I always find it useful to get students to work in groups to evaluate a few websites and then report back to the whole class with their findings. You can use an online platform like Flipgrid to facilitate feedback. Working in groups encourages students to voice their responses to the website and defend their points of view.

I believe it is also vital to correct poor findings – and yes, I have had groups present findings that the tree octopus is real, or that dihydrogen monoxide (water) is a dangerous substance.

 

 
 