
Author Archives: Dorian Love

About Dorian Love

I teach ICT, Coding & Robotics, Thinking Skills and English at Roedean School (SA). I am passionate about educational technology and critical thinking.

Is Thinking Digitally a New Habit of Mind or Simply Old Habits in a New Context?


What makes for good thinking? It is commonly assumed that it is our intelligence or abilities that have the greatest effect on our thinking. Ron Ritchhart (2002), however, makes the case that it is our dispositions, the acquired patterns of behaviour which activate and direct our abilities, that matter most. More than our abilities themselves, our dispositions afford our capacity for greatness. These dispositions are shaped and formed by our interactions with others, by our culture, society and history (Ritchhart & Perkins, 1997). Dispositions, or habits of mind (Dewey, 2007), are pivotal because they determine when we choose to think, the importance we place on it, and how deeply we question the assumptions and narratives (Bruner, 1991) upon which thought is based, allowing new narratives to emerge.

For example, the habit of mind persisting suggests a disposition towards keeping at it: trying new ways to approach a problem, not being daunted by failure, worrying away at something until you succeed. This disposition applies in many different contexts, including digital spheres, such as computer programming. It is clearly a disposition which affords success. You cannot succeed if you give up too easily!

Likewise, the capacity for finding humour affords success in many different contexts. Costa & Kallick (2008) have identified sixteen such habits of successful thinkers. These habits do not describe the knowledge or skills necessary for success, but the dispositions which enable successful thought and action in the world.

In this article I will be arguing that Thinking Digitally is just such a disposition; a habit of mind which predisposes those who demonstrate it towards success in the same way that persistence or finding humour does. It is a disposition towards using digital tools to aid thought and action in the world; towards deploying algorithmic thinking as a way of generating explanations of reality, and using computational thinking as an approach to problem solving. These ideas will be unpacked below, but it is important to note that they share a concern with the disposition towards using digital devices to create, solve problems and understand the world we live in. I have chosen to use the term Thinking Digitally rather than the more commonly used computational thinking because I believe it describes a wider range of uses of digital devices: not only how we organise tasks that are going to be automated by machines, mathematical modelling, and the use of simulations, simulators, calculators and computers, but also the ways in which we access and evaluate information, and how we author digital texts and communicate our understandings in the current era.

I will argue that Thinking Digitally is a logically separate habit or disposition rather than the application of any combination of the other sixteen habits in a new context, the digital world. Marc Prensky (2001b) has advanced the idea that people born after a certain date, whom he has dubbed digital natives, carry a natural disposition towards digital technologies. He argues that people born before the advent of the digital era are digital immigrants; that digital technologies are somehow foreign to them, and that while digital natives are able to use digital devices effortlessly, for immigrants there is no such natural facility. In other words, digital dispositions are based on age. While this idea has gained widespread traction because it seems to explain why some people take to digital technologies like a duck to water, while others struggle with them or avoid them entirely, it is a notion that has been roundly critiqued, as we shall see below. I shall argue instead that the notion of a set of digital dispositions, based on certain characteristics or habits of mind rather than on age, carries more weight.

In recent years, within the Computer Science community, teachers have begun to argue that Computational Thinking, often called Algorithmic Thinking, should be integrated into the curriculum, not just as a separate discipline, but as a foundational subject. I will argue that Computational and Algorithmic Thinking form a new mode of thought, different from the paradigmatic or narrative modes, and that the predisposition towards using this mode of thought forms the basis for arguing in favour of a separate habit of mind. I will argue that a disposition towards using computational and algorithmic thought, allied to a disposition towards using digital media to access and legitimate knowledge, forms part of an evolving revolution in thought, which is transforming the world, and towards which some people appear to be more disposed, and others less. Acquiring the habit of digital thinking clearly advantages some, and leaves others floundering in an increasingly digital world.

If we look at recent ideas around epistemology, theories about learning and cognition, pedagogy and literacy, we can begin to discern polar opposites which, while they do not define neat categorizations of analogue and digital thought, do point to ways in which certain polarities are afforded by either analogue or digital modalities. We are living in an Information Age (Castells, 2011), in which the storage and processing of information in digital forms is increasingly central to our knowledge and economic survival, and this privileges certain ways of looking at and thinking about the world. Our ability to succeed in this altered landscape is largely dependent upon our dispositions and habits of mind, not just on our knowledge or skills. Some of these dispositions have already been described by Costa and Kallick, but some are unique to digital contexts, and revolve around how we think about our relationship with digital cognitive technologies.

Kahneman (2011) has argued that two systems of thinking operate. System I thought is fast, intuitive thinking in which we come to rapid assessments based on our accumulated habits of thought and the received judgements handed down to us. System II thought, on the other hand, is slow thinking in which we carefully reason through a position, considering the evidence and reaching a judgement based on a logical process. We need to rely on System I thought most of the time. We cannot tell life to wait while we carefully reason over a position. System II thought is, however, vital because it allows us to build better instincts, and base our intuitive responses on more solid ground. In other words, it frames our habits and dispositions. Successful critical thinking is based on building solid habits of mind on which we can depend when we don’t have the time to carefully consider our responses.

In much the same way, I want to make the case that the salient feature of our use of digital technologies lies not in the skills and abilities we bring to any given task, but in the dispositions and habits of mind that unlock what we are able to accomplish with technology. Behind digital literacy lies, in other words, a set of mental habits and dispositions which allow the power of the technology to be unleashed. My capacity to use a database is one thing, but it is my digital dispositions which determine when I choose to use a database, for what purposes, and to what effect. These dispositions are what allow some people to use technology to unlock their creativity and solve problems rapidly and effectively, and what constrains others to use it merely for repetitive or largely meaningless tasks, or not to use it at all.

What are the dispositions and habits of mind of successful people when they are using digital devices and media? Do these dispositions form a logically discrete entity or are digital technologies simply a different context in which the habits and dispositions (Costa & Kallick, 2008) that apply to analogue thinking are applied?

In working through this argument I am going to make a number of moves which are intended to present the case that there are indeed a set of uniquely digital dispositions, and that it makes sense to speak about thinking digitally as an overarching approach that successful thinkers bring to their use of technology, which shapes and guides effective digital practice. Indeed I will argue that these digital dispositions may also be brought to analogue contexts. In other words, we human beings, in learning how to think effectively when using digital tools, may have acquired some dispositions which we can carry back into the real, analogue world.

Computational Thinking

[Figure: a typical breakdown of the elements of computational thinking]

The first move is to argue that one set of digital dispositions hinges around the thinking that we do when we interface with machines. A frequently used term is Computational Thinking (Wing, 2006), which is used to describe the “thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information processing agent” (Lee, 2016). It is essentially taken to mean, then, the thought processes of one who is using a machine to solve a problem, and it is normally constituted as being based on three cognitive pillars: abstraction, automation and analysis. When we use cognitive technologies to automate or perform tasks, we need to program the machine, we need to break the task down into sub-routines that can then be automated, and we need to think the task through, understanding it from the point of view of what it is that the machine can and can’t do. The illustration above (“Reading A Book or An Hour of Code?,” 2016) is typical of attempts to break down what constitutes computational thought.
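To make these pillars concrete, here is a minimal sketch of my own (the names, marks and pass mark are invented for illustration, not drawn from the cited sources): a small, repetitive task is abstracted into data the machine can handle, decomposed into sub-routines, and then automated.

```python
# Abstraction: reduce each student to the two features the task needs.
def load_marks():
    return [("Ayesha", 72), ("Ben", 48), ("Chloe", 55), ("Dumi", 39)]

# Decomposition: one small, testable decision per sub-routine.
def is_failing(mark, pass_mark=50):
    return mark < pass_mark

# Automation: the machine repeats the analysis for every student.
def needs_support(students):
    return [name for name, mark in students if is_failing(mark)]

if __name__ == "__main__":
    print("Needs support:", needs_support(load_marks()))
```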

Proponents of Computational Thinking in the curriculum argue that “algorithmic thinking” should form the fourth R of the curriculum: Reading, ‘Riting, ‘Rithmetic & ‘Rithm (Grover & Pea, 2013). The two terms appear to be used interchangeably. What then is Algorithmic Thinking?

Algorithmic Thinking vs Narrative & Paradigmatic Thought

[Figure: the paradigmatic and narrative modes of thought]

Bruner (1991) sets out a dichotomy between two modalities of thought: the narrative and the paradigmatic. The paradigmatic mode, the scientific mode, seeks to logically categorize the world and test hypotheses of what is. The narrative mode, on the other hand, is concerned with the meaning that is ascribed to lived experience. When explaining the world, why an apple falls to the ground, for example, one could use paradigmatic thinking, appealing to a concept, the theory of gravity. But one could also use the narrative mode to explain why an apple falls to the ground: the apple falls because I knocked it with my elbow, and it rolled off the table and fell to the ground. There is nothing necessarily superior in either explanation. Both have explanatory power.

The advent of complex calculating machines, however, has furnished us with new ways to go about constructing an explanation of the world. Computer modelling offers ways of constructing a model of how tables, elbows and apples might behave under different sets of circumstances. Using a computer to run the model with slightly different sets of input data and observing the outcomes can generate a more complete understanding of how apples roll and fall, and allow us to make more accurate predictions. The sheer number of calculations necessary to sustain this mode of thought was largely impossible before the advent of computers, but it now allows us to predict the weather and find distant planets.
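As an illustration of what this explanation-by-simulation looks like in practice, here is a toy sketch of my own (the table height and push speeds are assumed values, and the physics is deliberately crude): the same model is simply re-run with different inputs, and the pattern of outcomes does the explanatory work.

```python
# Toy model: an apple is nudged off a table and falls under gravity.
# Re-running it with different push speeds is the simulation move
# described above: vary the inputs, observe the outcomes.

G = 9.81              # gravitational acceleration (m/s^2)
TABLE_HEIGHT = 0.75   # assumed table height in metres

def landing_distance(push_speed, dt=0.001):
    x, y, vy = 0.0, TABLE_HEIGHT, 0.0
    while y > 0:               # step the model forward in small time slices
        x += push_speed * dt   # horizontal drift from the initial nudge
        vy += G * dt           # gravity accelerates the fall
        y -= vy * dt
    return x

for speed in (0.1, 0.3, 0.5):  # metres per second off the table edge
    print(f"push at {speed} m/s -> lands {landing_distance(speed):.2f} m from the table")
```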

[Figure: the paradigmatic, narrative and algorithmic modes of thought]

Those who have advanced the idea of an algorithmic mode of thought would be hard pressed to say where to place it in Bruner’s schema. Algorithmic thinking is not necessarily about categorizing the world, or logically predicting outcomes, nor is it about narrating an explanation, although it may involve both these things. It shares with paradigmatic thought a concern with establishing procedures, but shares with narrative thought a concern with sequencing. It should probably then be seen as a separate mode of thought. In the diagram, I have represented this by placing it off to one side.

But, as we noted, there are considerable overlaps, so it would probably be better to represent this schema as a series of overlapping circles. This schema has the advantage of allowing for greater nuance. Bruner himself, for example, shows how narrative thought lies behind much of scientific explanation.

[Figure: paradigmatic, narrative and algorithmic modes of thought as overlapping circles]

Behind the algorithmic mode of thought lie the mathematical modelling of the world and the methodology of simulation. Mathematical modelling, enabled by the power of computers, represents powerful new ways of acquiring knowledge of the world. Evolutionary Engineering, for example, uses comparisons between randomly varied designs to reach more efficient designs in the fewest number of steps. Rather than designing complex solutions from first principles (the paradigmatic view), one compares one design to another, selects the better one, and repeats the process; optimum designs can be reached in relatively short periods of time, as sketched below.
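Here is a minimal sketch of that compare-and-select loop, under invented assumptions (the “design” is a single number and the cost function is a stand-in objective); real evolutionary engineering applies the same loop to far richer design representations.

```python
import random

def cost(design):
    # Stand-in objective: assume the ideal design value is 3.7.
    return (design - 3.7) ** 2

best = random.uniform(0, 10)                # start from a random design
for step in range(200):
    variant = best + random.gauss(0, 0.5)   # randomly vary the current design
    if cost(variant) < cost(best):          # compare the two, keep the better one
        best = variant

print(f"best design after 200 comparisons: {best:.3f}")
```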

Cognitive Offloading & Cognitive Technologies

Because we have such small working memories (Sweller, 1988) we constantly need to offload our cognition onto the environment. We use language and writing to achieve most of this, but we also use our fingers when counting, or devices such as calculators, calendars, books, and increasingly these days digital devices such as cell phones, computers or the Internet. Dror (2007) argues that digital tools have become so pervasive and integral to our thinking processes that they cease to be external aids and become internal components in our cognitive processes. Cognitive technologies, in other words, have become instrumental in shaping our cognitive processes. New forms of distributed cognition (Clark & Chalmers, 2010) have arisen, in which our thinking is distributed across machines and across other people (Dror & Harnad, 2008), radically altering our cognitive landscapes.

These new theories about cognition and knowledge, encompassed within what is known as the Second Cognitive Revolution in psychology (Miller, 2003) clearly represent radical new ways of looking at what it means to be human, and how we see the relationship between man and machine.

Computational Thinking & Digital Thinking

Jeannette Wing, who introduced the term Computational Thinking, described it as “a universally applicable attitude and skill set everyone, not just computer scientists, would be eager to learn and use” (Wing, 2006). Nevertheless, despite a growing receptivity to the idea that Computer Science should be considered a core subject in the curriculum (Grover & Pea, 2013), and despite the growing integration of ICTs in our lives and educational systems, Computational Thinking is still seen largely as part of the Computer Science curriculum rather than a mode, or disposition, of thought which pervades all subject disciplines. Lee argues that there is a tendency to conflate Computational Thinking with the operational steps involved, with computer literacy or with information literacy (Lee, 2016). This misses the point that Computational Thinking is in essence the set of dispositions of thought that human beings use when they are using machines to automate thinking processes, or offloading cognition onto machines.

I think part of the difficulty lies in the use of the term computational. As Wing stresses, the aim of Computational Thinking is not to get human beings to think like computers, but to describe how human beings think when they are using machines to solve problems. The term computational is so firmly rooted in the mathematical and engineering sciences that it tends to preclude the use of computers for perhaps their most vital task, that of communicating and accessing knowledge. How knowledge is legitimated and disseminated lies at the centre of our cognitive identities, and the new digital technologies are in the process of transforming the rules of the game. I believe that it is more useful to speak of how human beings organise their thought processes around problem solving using machines as digital thinking rather than computational thinking. This semantic change allows us to conceptualise computers as communication as well as computational devices and to see the dispositions of thought as including both purposes.

Digital technologies have transformed our practices in any number of ways, but it is important to note that every shift in practice represents a mental shift as well. It changes how we approach problem solving in subtle but important ways.

Digital Thinking vs Analogue Thinking

The second move is to claim that human thought is undergoing something of a revolution, and that particular aspects of our cognition are being accentuated and transformed by the rapid advances in digital media and what this affords. These changes are sufficiently momentous, and the contours of such change sufficiently discernible that we can begin to talk about a difference between digital and analogue thinking. These differences are not absolute; we have not acquired new brains overnight, but they are polarities which are impacting on our cognitive lives. I believe that those aspects of our thought which are afforded by digital technologies have been growing in importance, in line with the development of the information economy and the new world order, which has altered our discursive landscape (Gee & Lankshear, 2006) and may well be altering our brains as well. Neuroplasticity ensures that changes in practice have an impact on our brains (Prensky, 2001b), and it must be assumed, our cognition also.

[Figure: analogue versus digital thinking]

That we are experiencing change is not in contention, but how to characterise that change is. There are perhaps two possible views on this. The first, what we might term the hard view, is that Digital Thinking represents a new mode of thought afforded by the new digital media. This modality of thought has sufficient points of difference from Analogue Thinking to warrant identifying a range of habits and dispositions of mind which signal successful practice. This view sees cognition as being revolutionised. Stevan Harnad (1991), for example, talks of the Fourth Cognitive Revolution, as described below, and this has obvious ramifications for our behaviour and habits. The soft view, on the other hand, sees cognition as largely unaltered by digital media, but holds that new approaches and dispositions are needed for dealing with the differences inherent in using digital media. The illustration above attempts to characterise the two polarities of analogue and digital thinking as cutting across the different modalities of thought we discussed previously (paradigmatic, narrative and algorithmic thought).

I do not believe that we need to decide between these views. Whether cognition itself is changing, or only our habits of mind, does not essentially alter the schema which can be discerned emerging from the literature around digital literacies, cognitive psychology and computational thinking. In either event it makes sense to trace the contours of how we perceive the terrain as altered by the introduction of digital technologies.

In an always-on, always-connected world, as more and more devices become connected in the Internet of Things, and as connectivity becomes more commonplace and more integral to our lives, many are asking questions about what this means for our cognitive development. Headlines scream that the Internet is making us dumb (Carr, 2011; Thompson, 2013) and neuroscientists suggest that our brains are being re-wired (Siemens, 2014; Greenfield, 2014) for good or ill. Psychologists suggest that it is affecting our memories (Sparrow, Liu, & Wegner, 2011) or our social lives (Wooley, 2013). There is a sense in which we stand at a crossroads, uncertain of both the direction and meaning of the rapid changes which are shaping our society. Utopian dreams and dystopian fears permeate the debate around the extent to which technology is driving change. Advances in Artificial Intelligence raise the prospect that we will soon not be the only sentient beings on the planet, and usher in a Brave New World where the difference between Human and Machine may not be that clear cut.

I would argue that never before have we needed a roadmap for navigating the future as much as we do now. Moore’s Law (Gibbs, 2015), which observes that the number of transistors on a chip, and with it computing power, doubles roughly every two years, is strictly a claim about hardware, but it serves as a metaphor for the rapidity of change and expresses the sense in which the rate of change is exponential. I would argue that what is required is a new set of habits and dispositions, strategies for dealing with this change, and with the changes associated with the Internet revolution. Our old ways of thinking, the habits and dispositions of centuries, may not be enough to guide us in the future.

The advent of the printing press saw the introduction of a medium of mass communication which has had far-reaching implications for our history, society, and thought. When books were only available to a select few, and knowledge was dependent upon individual transmission, knowledge was hierarchical and legitimated by an appeal to authority alone. In the twelfth century Peter Abelard initiated a revolution in knowledge (Campbell, 1991), an appeal to evidentiary support in challenge to authority, where even theology had to bow before reason, but it was only with the invention of the printing press, and the mass production of books made possible by the Industrial Revolution, that knowledge could be distributed more widely, ushering in the ideological sea change represented by the Enlightenment. How knowledge is legitimated, disseminated and reproduced changes completely how individuals think, learn, create and solve problems.

I would argue that we are at a similar turning point represented by the new digital media. The ability to self-publish, afforded by the Internet, has changed how knowledge is disseminated, altering legitimation practices. Many academics publish material on their blogs, and the speed of publication ensures a form of peer review, though not in the conventional sense. An example of the future is Wikipedia, that repository of crowdsourced knowledge said to be as accurate as the Encyclopaedia Britannica. Stevan Harnad’s notion of academic “sky-writing” encapsulates the idea of the Internet combining the immediacy of oracy with the reflective power of literacy to bring about the Fourth Cognitive Revolution. Harnad’s (1991) schema is tabulated below.

Revolution | Medium | Approximate date | Cognitive affordance | Possible neural change
First Cognitive Revolution | Oracy | Hundreds of thousands of years ago | Communication is immediate and interactive | Growth of Wernicke’s and Broca’s areas in the brain?
Second Cognitive Revolution | Literacy | Tens of thousands of years ago | Allows for reflection between communications | Growth of Exner’s area in the brain?
Third Cognitive Revolution | Printing press | c. 1450 AD | Revolution of scale |
Fourth Cognitive Revolution | Internet | c. 1990s AD | Near-synchronous communication: brings together immediacy and reflection | Are our brains being rewired?

This periodisation carries with it a sense of the cognitive affordance offered by the new digital media of near-synchronous communication. Ideas can be disseminated rapidly, enabling quick responses and affording collaborative thinking, but with enough time for some reflection before responses are made. The last column, linked to possible neural developments, is highly speculative, and Harnad himself makes only tentative suggestions in this regard, but it feeds into a contemporary literature suggesting that the move from page to screen carries with it some neural rewiring.

News and opinions are shared rapidly across social media, replacing traditional news sources. In a world where breaking news is not mediated by the professional journalistic eye, but blogged, tweeted and facebooked by participants, we need to reassess how we access knowledge about the world. Content is similarly mediated by new mechanisms. In a world where opinions are discussed through the vehicle of the share, the like and the Internet meme, we need to see knowledge as a commodity that is rapidly traded across the globe in relatively unmediated ways.

I think we can begin to discern the basis for an argument that we are seeing a revolution in cognition, a shift away from analogue, towards digital modes of communication and cognition.

Digital media do not replace analogue channels. As a species we still rely on our senses. Nevertheless, digital media represented by Internet and other telecommunication technologies, represent new modes of communication which have far-reaching consequences for how we communicate, think, learn, create and solve problems.

How our thinking is mediated, and the types of cognitive offloading that are afforded by these technologies, may be represented by the following comparison of digital and analogue contexts:

Hyperspace (digital) vs. the five senses (analogue). Push technologies in particular afford the delivery of information from sources not limited to our biological five senses. For example, I learn what the weather is like outside from my cell phone rather than by looking out of the window.

Fast paced (digital) vs. slow paced (analogue). Acquiring data in an analogue world is slow and deliberate; gathering data in a digital world is almost immediate and less considered. Speed has advantages and disadvantages and calls for different approaches to evaluating data. I can use a search engine to find out about something rather than going to a public library and finding a book on the subject.

Multi-channel digital media streams vs. single-channel biological senses. While data in the analogue world comes at us via the multiple channels of our senses, the digital world opens up even more channels, some of them referencing analogue channels, others entirely in cyberspace. This has advantages and disadvantages and calls for different approaches towards data collection. As I type this, I am receiving data from my cell phone, emails and alerts are appearing in my inbox, and I am able to use the Internet to search for information, on top of what I sense about the world around me through my biological senses.

Parallel processing (digital) vs. serial processing (analogue). The speed at which different channels of communication can be accessed affords multi-tasking. Given the severe limitations imposed on our working memory we can never truly multi-task, and we can certainly take on simultaneous tasks in the analogue world too, but the plethora of extra channels extends our ability to work on many levels at the same time. I tend to take on more tasks in parallel while using digital media, partly because it is more efficient to do so, and partly because my mind is parcelling out sub-tasks that involve asynchronous communication: while I wait for a reply to a query I have posted on a forum, I write an email and watch a kitten video on YouTube.

Hypertextual screen reading vs. linear, narrative page reading. Readings in hyperspace are hypertextual rather than linear or narrative. Hypertextual reading affords rapid synthesis and evaluation; linear, narrative reading affords a closely argued train of thought. Using a website I can rapidly skim for relevant information by following links and using the Find or search function; if I want to follow a train of thought I have to read sequentially.

Fast and wide thought vs. slow and deep thought. Multi-tasking and hypertextual reading afford the rapid synthesis and evaluation of ideas, the identification of patterns, and fast thought; linear and narrative reading affords closely reasoned consideration of a train of thought. Most decisions we make are the result of fast thought, and digital media afford this by making more information available and allowing us to find patterns more actively. Page reading, however, affords deep thought because I have to follow the sequence of words in the order they were written.

The reader creates the text vs. the writer creates the text. Hypertext is epistemologically predisposed towards a constructivist paradigm in which readers construct their own meaning from the world around them, creating the text by following links. Print text is predicated on an author communicating and transferring a message, and is epistemologically neutral: the text is created by the writer.

Constructivist vs. instructivist pedagogical paradigms. Theories about how we think and learn have changed considerably over the last hundred years or so, with a shift towards theories stressing how meaning is constructed rather than perceived.

Learning by participation vs. learning by acquisition. Sfard (1998) has argued that there are two metaphors for learning, and that both are valid: learning as participation and learning as acquisition.

Symbolic manipulation vs. symbolic creation. Fast thought affords rapid symbol manipulation, while slow thought affords the creation of new symbols. Digital media allow us to rapidly manipulate information in many forms, but page reading allows us to generate new concepts.

Computational (algorithmic) thinking vs. narrative and metaphoric thinking. Jerome Bruner has shown the importance of narrative in our cognition: if we think about an issue we tend to do so by relating it, and we also think metaphorically, comparing things to something similar in our experience. Computational thinking is something different; it operates at the level of simulation and modelling. Digital media allow us to create simulations by modelling a problem and playing it out rather than thinking through a solution by relating its contours, which often produces unexpected and counterintuitive results.

This schema presents polarities of what must inevitably form a continuum, and I do not believe that they form a rigid dichotomy. It is far more accurate to say that there is a dialectical relationship between the two.

The Foundations Of Digital Thinking

This characterization of twin polarities, operating in a dialectical (dialogic) relationship rather than in opposition, allows us to argue that there are several elements in human thought which, if not new, are at least being emphasised and foregrounded by the growing centrality of computers and ICTs in our lives. If we accept that even more important than our abilities and capacities are our dispositions, our ingrained behaviours and responses to the problems and challenges we face, it becomes clear that how we react to this shift in technology from analogue to digital is crucial to what we can accomplish.

What elements then constitute the dispositions of successful users of computational and digital media? If we look at successful use of digital media I think we can start to identify a number of elements.

Embracing Change

We all know that some people seem to embrace change and use it to their advantage, while others appear fazed by change and shy away from it, or appear less imaginative and productive in the ways in which they use it. This has been falsely framed as a generational divide (Prensky, 2001a); that those born after 1985 (digital natives) are somehow wired differently, and have a natural affinity with ICTs which older people (digital immigrants) lack. While our brains may well be wired differently, if I am following this debate correctly, neural plasticity implies that all users of new technologies, regardless of age, will experience changes. There is a great deal of research indicating that this notion of a differently wired, differently thinking generation is a myth (Kirschner & van Merriënboer, 2013). There is no evidence to suggest, for example, that a new generation of youngsters is predisposed, somehow, to self-directed, self-paced, collaborative learning because their brains are somehow wired this way.

The divide is really one of disposition. Some youngsters are “immigrants” as much as some older people are, and some older people are completely at home with technology. Exposure to, and comfort with, a technology does not in and of itself imply a capacity to use the technology in the most beneficial ways. For example, technology may afford self-directed learning, but it is one’s dispositions which determine whether or not these opportunities will be taken up. These dispositions generally need to be taught, and herein lies the central paradox of education. Discovery learning, as attractive as it sounds, is ultimately a deeply flawed idea because of the scholar’s dilemma: how can you discover something until you know it is there to be discovered?

For example, our brains, constrained by cognitive load, limit our ability to multi-task (Kirschner & van Merriënboer, 2013; Sweller, 1988). Our dispositions and habits, however, may allow us to handle multiple tasks with greater facility, overcoming the limitations of our biology to some extent.

Successful thinkers tend to embrace change rather than balk at it. Thinking Digitally means that you are constantly ready to adapt to change, and see it as an opportunity rather than a constraint.

Production vs Consumption

They say a picture is worth a thousand words. There is all the difference in the world between creating and editing pictures and viewing them. It is not, however, that the one is good and the other bad. There are skills and sensibilities, dispositions and capacities which are involved in both activities. To read a picture effectively requires a thorough understanding of the genres and contexts in which the picture was taken. Producing a picture also requires skills and abilities, understandings and literacies. I would argue then that we cannot privilege authorship above criticism. Both have value and a place. We are all at some stage or another a producer or a consumer of both analogue and digital technologies and products.

Nevertheless, there are good habits and dispositions around production, and likewise for consumption. These dispositions are likely to be very similar in analogue and digital contexts. Having said this, however, we need to highlight one vital difference.

All technologies require knowledge and skills both around production and consumption. But some technologies are clearly more complex than others and require specialised skills and knowledge. While almost everybody learns to read and write, very few learn to write software code. While most people can use a screwdriver, relatively fewer can use a soldering iron. This complexity factor limits and constrains the use of all technology. But digital technology represents a level of complexity which makes this division between people who can and people who can’t more stark. It is relatively rare to hear someone say of any analogue technology that it is simply beyond them and they cannot use it in any way. And yet this attitude towards digital technology is frequently heard.

A person who is relatively handy can tinker with, and fix a range of analogue technologies without ever being an expert in any of them. And yet, increasingly this tinkering is not possible with digital technologies as the technology requires such specialised skills and knowledge that the ordinary “handyman” cannot fix a problem.

I am no motor mechanic, but even I have been able to get a Volkswagen Beetle motor running again with a little logic, a prayer, and a spanner. There is no way that I could do the same with modern cars, with their electronic complexity and sealed units. The growing complexity of our machines threatens to turn us all into helpless consumers, and robs us of the ability to tweak and tinker which is an essential part of being an effective producer.

Personal computers were first introduced very much within a tinkering culture; the first computers were shipped with no applications – users had to create their own programs. These days the reverse is true – computers and devices are loaded with a wealth of applications, and programming is both unnecessary and discouraged. Production within each application is further consumerized by a move towards authorship using templates and wizards. Computing has moved from a strong producer ethos to one which encourages consumption. The rapid growth in computer usage explains this shift, but I would argue that as the user base expands so does the need to actively encourage producer dispositions, which are being lost by the ease of consumption.

Growing social complexity and knowledge specialization means that increasingly we are reliant on others to produce and fix, what formerly we could tinker with ourselves. This leaves us often helpless. While knowledge and skills specialization accounts for some of this movement from producer to consumer cultures, much of it is driven by underlying habits and dispositions. We can see in the Maker Movement (“Maker culture – Wikipedia, the free encyclopedia,” 2016) and in the thinking behind Computational Thinking, a growing reaction to this trend. As analogue technologies become increasingly digitized, I would argue that the need to foster producer mentalities and dispositions should be seen as a core aspect of Digital Thinking.

Successful thinkers display dispositions towards producer mentalities. Thinking Digitally means that you are able to Hack Your Life.

Creativity vs Drudgery

We face a future in which Artificial Intelligence will transform all aspects of our economy and society (Cellan-Jones, 2014). While we do not know what the future holds in store, I would argue that we need to see the growing technologization of society as an opportunity, and need to emphasise a disposition towards creativity rather than drudgery. Technology may be used to free up our lives for leisure and creativity, or for drudgery and work. Andrew Feenberg (1991) described the central contradictions between the potentialities for control and democratization inherent in technology, and dystopian and utopian visions of the future, linked largely to these polarities, have been pivotal to speculation over the role of technology.

Do we face a future in which a few use technology to control and pacify the many, in which humanity at large is debased by a digital divide in which they are passed over, or will technology free the majority from lives of drudgery to unlock their full potential? This is very much a political choice, and depends upon our social organization. The sharp debates around the politics of Austerity reveal the contours of the choices we have to make within the new world order, whether to accept Neo-Liberal Taylorist arguments about the nature of Capital, or to seek Socialist alternatives and a more equal distribution of wealth. The stark choice is whether we valorize capital above labour, or see technology as an opportunity to unlock human potential in ways not previously imaginable.

We urgently need a disposition towards creativity over drudgery in the ways in which we approach our use of technology.

Jane McGonigal (2011) has written about what she calls the four superpowers of games, and how these powers may make a better world. These superpowers form perhaps the dispositions of successful gamers. McGonigal argues they are the crucial dispositions for life itself. They are:

  1. Urgent Optimism
  2. Blissful productivity
  3. Social Fabric
  4. Epic Meaning

Her characterization of gaming as empowering in gamers a hopeful expectation of success, in conjunction with productive work within a social network to accomplish meaningful tasks, expresses a disposition towards optimistic, humanistic outcomes, and resonates with a view of human purpose vastly at odds with the Malthusian vision of humanity as mere cogs in a machine.

Successful thinkers display dispositions towards creativity and unlocking their human potential. This requires constantly finding new ways to express yourself through technology, rather than allowing it to be used to marginalise your humanity. Thinking Digitally means that you are alive to the human within the machine.

Conclusion

In this exploratory paper I have attempted to argue that the most crucial element of our emerging relationship with technology is the dispositions and habits of mind we bring to our use of machines to automate and extend our thinking. The digital divide is often conceptualised as a divide around access to devices, but in reality it is a mental divide between those whose dispositions empower human creativity and problem solving, and those whose habits of mind limit the realm of the possible.

Bibliography

Bruner, J. (1991). The Narrative Construction of Reality. Critical Inquiry, 18(1), 1–21.

Campbell, J. (1991). The Masks of God: Creative Mythology. Penguin Group USA.

Carr, N. (2011). The Shallows: What the Internet Is Doing to Our Brains. W. W. Norton & Company.

Castells, M. (2011). The Rise of the Network Society: The Information Age: Economy, Society, and Culture. John Wiley & Sons.

Cellan-Jones, R. (2014). Stephen Hawking warns artificial intelligence could end mankind – BBC News. Retrieved April 8, 2016, from http://www.bbc.com/news/technology-30290540

Clark, A., & Chalmers, D. J. (2010). The Extended Mind. In The Extended Mind (pp. 26–42).

Costa, A. L., & Kallick, B. (2008). Learning and Leading with Habits of Mind: 16 Essential Characteristics for Success. ASCD.

Dewey, J. (2007). Human Nature and Conduct: An Introduction to Social Psychology. Cosimo, Inc.

Dror, I. E. (2007). Cognitive Technologies and the Pragmatics of Cognition. John Benjamins Publishing.

Dror, I. E., & Harnad, S. R. (2008). Cognition Distributed: How Cognitive Technology Extends Our Minds. John Benjamins Publishing.

Feenberg, A. (1991). Critical theory of technology. Oxford University Press, USA.

Gee, J. P., & Lankshear, C. (2006). The New Work Order: critical language awareness and “fast capitalism” texts. Discourse: Studies in the Cultural Politics of Education. http://doi.org/10.1080/0159630950160102

Gibbs, S. (2015, July 9). Moore’s law wins: new chips have circuits 10,000 times thinner than hairs. Retrieved March 22, 2016, from http://www.theguardian.com/technology/2015/jul/09/moores-law-new-chips-ibm-7nm

Greenfield, S. (2014). Mind Change: How digital technologies are leaving their mark on our brains. Random House.

Grover, S., & Pea, R. (2013). Computational Thinking in K-12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43.

Harnad, S. (1991). Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge. Public-Access Computer Systems Review, 2(1), 39–53.

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin UK.

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do Learners Really Know Best? Urban Legends in Education. Educational Psychologist, 48(3), 169–183.

Lee, I. (2016). Reclaiming the Roots of CT. CSTA Voice, 12(1), 3–4.

Maker culture – Wikipedia, the free encyclopedia. (2016, March). Retrieved April 8, 2016, from https://en.wikipedia.org/wiki/Maker_culture

McGonigal, J. (2011). Reality is Broken: Why Games Make Us Better and How They Can Change the World. Random House.

Miller, G. A. (2003). The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.

Prensky, M. (2001a). Digital Natives, Digital Immigrants: A New Way to Look at Ourselves and Our Kids.

Prensky, M. (2001b). Digital Natives, Digital Immigrants Part 2: Do They Really Think Differently? On the Horizon, 9(6), 1–6.

Reading A Book or An Hour of Code? (2016, March 22). Retrieved May 10, 2016, from https://lh3.googleusercontent.com/-bQWsEynX-8A/VgGCHw2AdrI/AAAAAAABWTU/_y9VVvBQYgM/w1728-h1728/computational-thinking-white-bg.jpg

Ritchhart, R. (2002). Intellectual Character: What It Is, Why It Matters, and How to Get It. John Wiley & Sons.

Ritchhart, R., & Perkins, D. (1997). When Is Good Thinking? In D. Dai & R. Sternberg (Eds.), Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development. Mahwah: NJ: Erlbaum.

Sfard, A. (1998). On Two Metaphors for Learning and the Dangers of Choosing Just One. Educational Researcher, 27(2), 4.

Siemens, G. (2014). Connectivism: A Learning Theory for the Digital Age. Retrieved from http://er.dut.ac.za/handle/123456789/69

Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778.

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Thompson, C. (2013). Smarter Than You Think: How Technology is Changing Our Minds for the Better. HarperCollins UK.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33.

Wooley, S. (2013). Constantly Connected: The Impact of Social Media and the Advancement in Technology on the Study Abroad Experience. Elon Journal of Undergraduate Research in Communications, 4(2). Retrieved from http://www.studentpulse.com/articles/822/constantly-connected-the-impact-of-social-media-and-the-advancement-in-technology-on-the-study-abroad-experience


Posted on December 6, 2019 in Computational Thinking, Critical thinking, Habits Of Mind, Learning Theories, Pedagogy, Thinking Digitally, Thinking Skills

 

Teaching ICTs – Computational Thinking Pedagogies & Thinking Strategies

As more and more schools start teaching computing explicitly, in one form or another, to all students, the focus moves from advocacy, getting computing for all into schools, towards pedagogy: how best to teach it. There are a number of pedagogical approaches and cognitive strategies that have been promoted, but I would like to look at just one which I have found effective over the years. I think it would be fair to say that many teachers of ICTs are self-taught, and perhaps for this reason the discovery method is still remarkably popular. The influence of Seymour Papert’s Constructionism, allied with Constructivist learning principles, has made direct instruction far less prevalent in the computer class than in perhaps any other classroom. Nevertheless, a lack of direct instruction, I believe, threatens to undo many of the benefits of a discovery learning framework.

Discovery learning is predicated on setting problems which will allow students to learn through grappling with trying to find solutions, drawing on their past learning and knowledge. But there is the danger that no learning will take place at all if there is no scaffolding of the discovery process, and ironically, the greater the reliance on student-centered learning, the greater the need for directed teacher interventions. The greatest weakness of the discovery method is encapsulated in the scholar’s dilemma. How do you discover something that you don’t know exists? You need a more experienced other to, at the very least, nudge you in the right direction.

Teachers who use discovery learning need to be careful to make sure that students have the tools they need to learn something from problem-based approaches. This scaffolding can take many forms, but without it, learning is a very hit and miss affair. Teachers need to adopt a range of strategies to scaffold the kind of knowledge that will feed into problem-based learning activities. For example, you can carefully guide students through the base skills and knowledge needed, and then set more open-ended projects that build on this learning. Or you can set open-ended projects and make knowledge available where needed as students explore what they need to discover to solve the problems set. Alternatively you can pursue a mix of these approaches, giving some instruction up-front, and then supplementing knowledge where it is needed.

But another approach is also possible. If students are working in groups, you can use a jigsaw technique. Students can be split into expert groups and work on guided assignments so that they become experts in one aspect of the overall task. Groups are then formed by taking a member from each expert group so that each group has multiple experts in different aspects applicable to the task. They then teach each other what needs to be known to perform the task. For example, if a project uses spreadsheets, one expert can be trained in formatting cells, another in creating formulae, another in functions, one in graphing, and so on. When they combine they should be able to tackle a task which requires all these skill sets, and students should learn from each other.
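The regrouping step can be made concrete with a small sketch of my own (the student names and expert topics are invented): each jigsaw group is formed by taking one member from every expert group, so every group ends up with one expert per sub-skill.

```python
# Invented expert groups for a spreadsheet project.
expert_groups = {
    "formatting cells": ["Ayesha", "Ben", "Chloe"],
    "formulae":         ["Dumi", "Emma", "Farai"],
    "functions":        ["Gina", "Hari", "Ines"],
    "graphing":         ["Jo", "Kabelo", "Lena"],
}

# zip(*...) takes the i-th member of each expert group to form jigsaw group i.
jigsaw_groups = list(zip(*expert_groups.values()))
for i, group in enumerate(jigsaw_groups, start=1):
    print(f"Group {i}: {', '.join(group)}")
```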

An alternative approach is to have groups work on sub-tasks which combine to solve a larger problem. Each group works on a task which, combined with the others, comes together to provide an elegant solution. For example, one group might work on moving a robot, another on the operation of its crane, and so on. These approaches do, however, require different students to receive different instructional input. This can be achieved using a flipped classroom model. Even when I cover material in class, I make a video of the content and post it on my Learning Management System so that it is accessible to everyone, even if a student did not do that particular task at the time.

Having said this, managing different instruction for different students is a logistical problem. I prefer to have a wide range of tasks and either allow students to choose tasks they wish to pursue, or to focus tasks around building capacity for particular projects. An example would be the task below. By preparing task cards which set out the brief for a task but also include a hint about how to proceed, students can work at their own pace, or use the tasks to build up the skills needed to tackle larger projects. The task card may include very detailed steps to follow, or simply hints.

I have gamified the tasks, which is not necessary, but does, I believe, add a certain something. If you allocate Experience Points (XP) for completing tasks, you can use the number of XP earned to unlock larger projects or challenges, much in the way that XP allows characters to level up in RPGs. This ensures that students complete as many tasks as possible, hopefully accumulating the base skills needed for the challenges and projects. Because the hints and instructions are on the reverse side of the task card, as a teacher you do not need to do a great deal of whole-class instruction. Jigsaw groups, or sub-task groups, can work relatively independently, typically after a whole-class session when introducing a new application. If students are still stuck I am able to answer individual queries in class, and there are always the videos as back-up.
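The XP bookkeeping itself is trivial to automate; here is a minimal sketch under invented assumptions (the task names, point values and unlock thresholds are all made up for illustration):

```python
# Invented XP values for task cards and thresholds for larger projects.
TASK_XP = {"format cells": 10, "write a formula": 15, "build a chart": 20}
PROJECT_THRESHOLDS = {"class budget project": 30, "robot crane project": 45}

def unlocked_projects(completed_tasks):
    xp = sum(TASK_XP[task] for task in completed_tasks)
    unlocked = [p for p, needed in PROJECT_THRESHOLDS.items() if xp >= needed]
    return xp, unlocked

xp, projects = unlocked_projects(["format cells", "write a formula", "build a chart"])
print(f"{xp} XP earned; unlocked: {projects}")
```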

 

Computational Thinking – The Ideal Knower?

The debate around the concept of Computational Thinking often revolves around a central distinction between those who see Computational Thinking as a fundamental skill, useful beyond the field of computer science alone and applicable as a general problem-solving tool (Wing, 2006), and those who warn that this view may make exaggerated claims (Guzdial, 2011; Denning, 2017). To my mind, the most useful way to look at Computational Thinking is to see it as first and foremost part of the extended knowledge practices of computer scientists, and to assess the transfer of knowledge and skills as a separate issue. After all, there is transfer of knowledge and disposition across all fields of human knowledge. Academia builds strong silos, but knowledge is often advanced by those who step outside their silos.

Karl Maton (2014), building on the work of Basil Bernstein and Pierre Bourdieu, argues that all knowledge is made up of both knowledge and knower structures. Uncovering the ways in which these knowledge/knower structures legitimate knowledge claims helps uncover the largely hidden codes to academic success.

We can describe knowledge (epistemic relations) along a continuum from weak to strong. Weak epistemic relations indicate fields where knowledge itself is relatively unimportant in legitimating knowledge claims; where epistemic relations are strong, knowledge is crucial. Equally, we can describe knowing (social relations) along a continuum from weak to strong. Weak social relations indicate fields where who you are as a knower is relatively unimportant in legitimating knowledge claims. Strong social relations, however, indicate fields where the dispositions and gaze of the knower define legitimacy in the field. If we set epistemic and social relations out on a Cartesian plane, as in the diagram, it allows us to identify different knowledge/knower codes.

Some fields emphasise one or the other. For example, knowledge in Science is mostly dependent upon the knowledge content: it represents a knowledge code. Who is doing the knowing, their ways of seeing and knowing, is largely, though not completely, irrelevant. By contrast, in the field of film criticism, an encyclopedic knowledge of world cinema alone does not guarantee legitimacy. Far more important is how the critic approaches film, how they structure and validate their arguments. Here the knower is emphasised, a knower code: having a cultivated gaze is crucial, and the knowledge itself is almost irrelevant. Where both are crucial to legitimating knowledge/knowing we have an elite code, as for example in Music. Where neither is important we have a relativist code: what you know and who you are are largely irrelevant, and all perspectives tend to carry equal weight.
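The four quadrants can be summarised in a small sketch (my own illustrative rendering of Maton’s specialisation plane, with the example fields taken from the paragraph above):

```python
def legitimation_code(epistemic_strong: bool, social_strong: bool) -> str:
    """Classify a field by the strength of its epistemic and social relations."""
    if epistemic_strong and social_strong:
        return "elite code"        # e.g. music: what you know and who you are both count
    if epistemic_strong:
        return "knowledge code"    # e.g. science: the knowledge content is decisive
    if social_strong:
        return "knower code"       # e.g. film criticism: the cultivated gaze is decisive
    return "relativist code"       # neither matters much; all perspectives carry equal weight

print(legitimation_code(True, False))   # -> knowledge code
print(legitimation_code(False, True))   # -> knower code
```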

It seems to me that viewing all knowledge from this knowledge/knower perspective helps to illuminate much of the debate around Computational Thinking. CT is usually defined as a set of procedures, as follows (a short sketch in code follows the list):

  1. Problem reformulation – reframing a problem so that it becomes solvable and familiar.
  2. Recursion – constructing a solution incrementally, building on preceding information.
  3. Decomposition – breaking the problem down into manageable parts.
  4. Abstraction – modelling the salient features of a complex system.
  5. Systematic testing – taking purposeful actions to derive solutions (Shute et al., 2017).
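
To make these procedures concrete, here is a small sketch on a problem of my own invention – totalling marks that are nested inside arbitrarily grouped lists – showing decomposition, abstraction, recursion and systematic testing at work:

```python
# Illustrative sketch of the CT procedures above, applied to an invented problem:
# totalling marks that are nested inside arbitrarily grouped lists.

def total_marks(item):
    """Abstraction: all we care about is whether an item is a single mark or a group."""
    if isinstance(item, (int, float)):      # base case: a single mark
        return item
    # Decomposition + recursion: break the group into parts and total each part.
    return sum(total_marks(part) for part in item)

# Systematic testing: purposeful checks on simple cases before trusting the solution.
assert total_marks(7) == 7
assert total_marks([1, 2, 3]) == 6
assert total_marks([[1, 2], [3, [4]]]) == 10
print("all tests passed")
```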

What is clear is that this describes a set of dispositions – ways of approaching problems, ways of seeing – rather than the set of knowledge structures that make up legitimate knowledge in computer science. If you look at the syllabus of a typical computer science degree programme, you will get a fair idea of what needs to be studied. It largely revolves around the analysis of algorithms and program design to enable data handling, software design and, increasingly, machine learning. The definition of CT does not describe the knowledge, but rather the knower structures of computer science. It sets out what one might consider the characteristics of the ideal knower. It describes how an ideal computer scientist looks at their field, in much the same way as the Scientific Method describes how an ideal scientist approaches theirs.

The clear value of the notion of CT rests, therefore, in laying bare what constitutes legitimate knowing in the field of computer science. It reveals the rules of the game quite explicitly. Because computer science is founded on well-developed knowledge structures, it represents a knowledge code in Maton’s matrix: who you are is far less important than what you know. If you are able to master the mathematical knowledge and understand the algorithms necessary for producing computational models of the world, that is quite sufficient to make you a computer scientist. But, as Maton points out, all knowledge has both knowledge and knower structures, and for many students these knower structures are occluded. Curricula often make explicit the knowledge content requirements, but leave unsaid the tacit characteristics that make up the ideal knower in the field.

If it is correct to say that CT defines the ideal knower’s dispositions – ways of being, seeing and doing – then computer science is fortunate in having these dispositions set out explicitly, offering clear pedagogical guidelines.

Bibliography

Denning, Peter J. 2017. “Remaining Trouble Spots with Computational Thinking.” Communications of the ACM 60 (6): 33–39. https://doi.org/10.1145/2998438.

Guzdial, M. 2011. “A Definition of Computational Thinking from Jeannette Wing.” Computing Education Research Blog. 2011. https://computinged.wordpress.com/2011/03/22/a-definition-of-computational-thinking-from-jeanette-wing/.

Maton, Karl. 2014. Knowledge and Knowers: Towards a Realist Sociology of Education. London: Routledge.

Papert, Seymour. 1980. Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books.

Shute, Valerie J., Chen Sun, and Jodi Asbell-Clarke. 2017. “Demystifying Computational Thinking.” Educational Research Review 22 (September): 142–58. https://doi.org/10.1016/j.edurev.2017.09.003.

Wing, Jeannette. 2006. “Computational Thinking.” Communications of the ACM 49 (3): 33–35. https://doi.org/10.1145/1118178.1118215.

 

EduTech Africa 2019 – Coda

Last week I attended the EduTech Africa 2019 Conference in Johannesburg and would like to wrap up my thoughts on the conference with a few observations. Now that the dust has settled, the thing that sticks out most in my mind is the clear recognition of the rise of Computer Science as a K-12 academic discipline. The government’s commitment to rolling out IT as a subject, and the focus on coding across all age groups, have established a clear sense that Computational Thinking and Computer Science belong in the core curriculum in all schools. The recent announcement that the PISA assessments, which offer international benchmarks in Maths and Science, will now include Computational Thinking and Computer Science confirms this. The big question, then, is how we get there. Most of the talks I attended addressed, in some form or other, the issue of how best to teach Computer Science. Passionate teachers shared their best practice, and their failures. So the coda to my reflections on the conference is really to address that question: is there a best method for teaching Computer Science?

N.S. Prabhu (1990), in answering the question of whether or not there is a best method of teaching, concluded that the key factor in teaching success lies in the teacher’s sense of plausibility: the teacher’s belief that what they are doing makes sense, and their passion for doing it. There is clearly a great deal of plausibility around the teaching of Computer Science at the moment. Obstacles are being treated as opportunities, and there is a very real sense that inventiveness and creativity can overcome the constraints of budget and lack of training.

The clear consensus amongst teachers seems to be that physical computing forms the best approach. Most presentations highlighted the use of coding in conjunction with 3D printing and robotics. My very first exposure to teaching computing was with Seymour Papert’s (1980) Logo system. I did not have the turtles, only the computer interface, but I tried to make the work more concrete by using physical cards with shapes students had to emulate. Computer Science is a very abstract subject and needs to be concretised for students as much as possible. The cost of all the kit needed to do this, however, can be prohibitive.

I recently came across the micro:bit, which uses a web-based platform for coding. The code you create is downloaded as compiled hex code to the micro:bit, which executes it. But crucially, the platform also has a web-based simulator, which runs the code right alongside the editing window. The micro:bit controllers are themselves fairly cheap, but having a simulator means that more students can code at any one time, so a class needs fewer physical devices. I have not yet been able to test the real thing, but it seems to me a perfect fit for the kinds of physical computing tasks I would wish to introduce. It uses a block-coding interface, but you can toggle to programming in JavaScript or Python, making it ideal for transitioning from block-based coding to the text-based fare students will need higher up the school. You can also design 3D-printed parts for interesting projects.
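
To give a flavour of how simple such programs can be, here is a short MicroPython sketch of the kind of task I have in mind. This is my own toy example, written against the standard microbit module’s documented API and not yet tested on the device itself:

```python
# A toy micro:bit program (MicroPython): count button presses and show a heart
# when a target is reached. My own untested example, using the standard
# microbit module's documented API.
from microbit import display, button_a, Image, sleep

presses = 0
while True:
    if button_a.was_pressed():
        presses += 1
        display.scroll(str(presses))   # scroll the running count across the LEDs
    if presses >= 5:
        display.show(Image.HEART)      # celebrate reaching the target
        sleep(2000)
        presses = 0                    # reset and start again
    sleep(50)
```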

But I digress; back to best methods. Another strong thread at the conference was computing for problem solving. I have to say that I am a little dubious about the claim that Computational Thinking leads to better problem solving in general. I believe it leads to better problem solving in computational contexts, but transfer of skills from one context to another is always problematic in my view. Nevertheless, I do believe that students should be given real-world problems to solve as far as possible, and Computer Science teachers are leading the way in envisioning how coding could form a central plank in cross-disciplinary problem-solving exercises. There was a great deal of talk at the conference about the need for teachers to “come out of their silos.” There is certainly no need for CS teachers to set projects divorced from the real world, or to set problems narrowly about computers.

The final method raised at the conference was unplugged computing, an approach which involves modelling algorithmic thinking without a computer. For example, students might be asked to write code to control a classmate acting as a robot performing a certain task. A talk by a primary school teacher on coding in the junior years had us all playing rock, paper, scissors. I’ve forgotten why, but it was great fun!
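
To show the bridge from the unplugged activity back to the screen, here is a tiny sketch of what such a “human robot” program might look like once typed up. The command set is entirely invented for illustration:

```python
# A toy "human robot" program: the class writes a list of simple commands and a
# classmate acts them out as the steps are read aloud. The command set is
# invented purely for illustration.

program = ["forward 3", "turn left", "forward 2", "pick up"]

def read_aloud(program):
    """Turn a list of commands into numbered spoken instructions for the 'robot'."""
    for step, command in enumerate(program, start=1):
        print(f"Step {step}: {command}")

read_aloud(program)
# Step 1: forward 3
# Step 2: turn left
# ...
```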

 

In the end, my take-away from the conference was to think about the best approaches for my own classes, and most particularly about how to integrate all three of these approaches better. To my mind that is the best sort of take-away!

 

Bibliography

Papert, Seymour. 1980. Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books. https://dl.acm.org/citation.cfm?id=1095592.

Prabhu, N. S. 1990. “There Is No Best Method – Why?” TESOL Quarterly 24 (2): 161–176.
 

EduTech Africa 2019 – Day 2

The second day of the conference started with a series of keynotes focusing on the what – the content – of education. Hayden Brown walked us through The Big History Project, an online course which looks at the history of the universe and our place in it. What struck me were the affordances offered by technology to take the curriculum to places beyond the reach of an individual teacher, but I also sensed in Hayden’s story the power of a passionate teacher to make a difference.

Hadi Partovi talked about the need to teach computer science in all schools. His code.org has done amazing work in promoting coding in schools, and represents a clear imperative to address a world where computer science is increasingly important. What struck me was how obvious that seems, and how important training teachers is going to be. I am not convinced by his argument that teachers can be reskilled to teach computer science that easily. There is not just a knowledge base to be learned, but a way of thinking too, and I’m not sure a short course can do that.

The drive to address an uncertain future, in which we have to prepare students for a world of work that has not yet been created, remains at the forefront of everyone’s attention. How to use the tools available, not just to enhance teaching but also to empower students as authors and creators, emerged as a central theme running through many of the talks. Whether the tool is a robot, VR goggles, a 3D printer or a Google Doc, the central message of the day was how to make students the authors of their own stories, the architects of their own learning.

I have to end this summary of the day with a comment I overheard at lunch: this is all very well, but conferences like this speak to those who are already enthusiastic about tech. But what of the rest of the faculty? What of teachers who shun technology? How do we include them?

I have no answer to that question. But I do believe that the pool of teachers who are enthusiastic adopters has grown enormously, and will continue to do so. Perhaps we have already reached a critical mass, a tipping point. Many teachers are quiet adopters, who have integrated technology into their classrooms without fanfare, sufficient to their purposes. Not everyone needs to champion a cause.

A bigger question is how to extend access to schools and teachers who fall on the wrong side of the digital divide. Lack of equity comes in many forms, but the most crippling way in which inequality is reproduced is through the uses to which technology is put. Some students are being taught to think critically and be creative with tech; others to capture data in a mindless way.

That, to my mind, is the big question that all teachers need to address.

 

 

EduTech Africa 2019 – Day 1

The tenor of this year’s conference is more subdued. Last year there was a clarion call for ending school and doing away with qualifications altogether! This year the call to transform education is rather more muted. The tone is one of quiet confidence, perhaps, rather than radical fervour. The line-up of keynote speakers, of course, largely drives the tone of the conference, and voices from industry were notably absent on the first day, their place taken, by and large, by teachers. The sense I get is that teachers have it under control. We’ve been playing around with how to integrate tech into our classrooms for years, and we are starting to get the hang of it!

Many of the talks I attended celebrated best practice. There is less and less advocacy every year, and more and more certainty both that technology is an important driver of change and that it has its limits. I heard almost no voices of radical disruption, and some speakers even voiced reminders not to throw traditional teaching methods out the window. On the floor of the exhibition hall, educational publishers and interactive whiteboard vendors have had to make room for a wave of robotics, 3D printers and coding solutions. Government’s move to introduce coding into the core curriculum has clearly put dollar signs in many eyes! As someone who is actively looking for this kind of hardware for my grade 8s and 9s next year, I had to force myself to look the other way. Never speak to sales reps: they all make the same claims and don’t understand pedagogy. I listened as hard as I could to my colleagues instead. So many teachers doing so many great things! We spoke about our successes, but also about our failures and dreams.

The voices raising concerns over the lack of equity and the digital divide were also muted this year, and here I must voice my own concern. The drive to solve educational problems through technology inevitably privileges the already privileged. Our two-tiered education system must not be allowed to reproduce itself this way. Teachers like myself, who work in well-resourced schools, have an absolute duty to pilot best practice that does not exclude by being prohibitively expensive. Government and academia have a duty to encourage the wider adoption of this best practice. It always disturbs me when politicians open a conference and then spirit themselves away without staying to listen to the voices of the rank-and-file conference attendee. Sadly our deputy minister did just that, so I do hope he managed to get some listening in.

Right at the end of the day a panel of experts reflected on the effectiveness of e-learning, and the question that caught my ear was how to train the computer science teachers needed as the new plan to teach computer science to all is rolled out. CS is a discipline in its own right, and the suggestion that you can simply retrain teachers to teach code is ludicrous. CS teaching has its own pedagogical concerns, and while teachers may well learn how to code fairly easily, and probably should, it takes training and experience to learn how to teach coding. Imagine retraining a cohort of Physical Education teachers to teach English, for example, and I think you’ll see what I mean. No offence intended – I certainly couldn’t teach PE to save my life! I sincerely hope that this point is understood by all stakeholders. There is still a mistaken belief out there that students are better at technology than teachers are, and so kids can really teach themselves coding. Of course some can – many of us are self-taught – but for the majority of students there is no substitute for a well-trained, experienced teacher.

To my mind, the tenor of the first day was all about this: the need for well-trained, experienced teachers.

 
 

Using Online Citation Creators

One of the huge bugbears for students when writing essays is the whole process of in-text citation and bibliography. There is no substitute for good old-fashioned teaching around how, and when, to use citations in the text and how to go about creating a bibliography, but the collation of bibliographical information and the formatting of bibliographical entries have always been problematic for students of all ages. Thankfully there are a number of free websites which allow you to create bibliographies with a minimum of fuss and bother. They all work in fairly similar ways and offer similar services, usually with premium versions offering long-term storage of citations, plagiarism checking and so on. It is easy enough to find a free one, such as EasyBib, which you can use to generate website, book, journal and a range of other entries. Users type in the URL, or the title of the book or journal article; the website then searches for the bibliographic information and offers a suggested entry. Most services allow users to add information that was not captured automatically. You can then copy the bibliography and paste it into your essay.

To my mind, the thought that students put into citations should go into the in-text part rather than into formatting the bibliography. Having a handy online tool liberates teacher and student to concentrate on this aspect. Most online services offer the option to copy the in-text citation as well as the bibliographical entry, but I prefer to get students to do this part themselves – how hard is it to extract the author and date? You also need to make sure that students check the generated entries for accuracy and add any missing data. Getting students to work in pairs to do this is a good idea.
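
As a rough sketch of what these tools do behind the scenes (the field names and the APA-like formatting below are my own simplification, not the output of any particular service):

```python
# A simplified sketch of how a citation tool might assemble a bibliography entry
# and an in-text citation from metadata. Field names and formatting are my own
# simplification, not any particular site's output.

def bibliography_entry(author, year, title, site, url):
    """Format one web-page entry in a roughly APA-like style."""
    return f"{author} ({year}). {title}. {site}. {url}"

def in_text_citation(author, year):
    """The part students should still be able to produce themselves."""
    surname = author.split(",")[0]
    return f"({surname}, {year})"

meta = {
    "author": "Smith, J.",
    "year": 2019,
    "title": "An Example Web Page",
    "site": "Example Site",
    "url": "https://example.com/article",
}

print(bibliography_entry(**meta))     # -> Smith, J. (2019). An Example Web Page. Example Site. https://example.com/article
print(in_text_citation(meta["author"], meta["year"]))  # -> (Smith, 2019)
```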

If students are using different websites, get them to rate the accuracy they achieve and make recommendations to each other.

 
 