Friday, May 02, 2008

Natives schmatives!

This month's big question from the Learning Circuits Blog (LCB) centres on whether it is necessary to design differently for digital natives.

For me, the first hurdle is that I question the relevance of the whole native/immigrant thing. I find insupportable the notion that there is a point in history which serves as a watershed.

Yes, there are people who were born after digital technology became part of the scenery, and those who were born before, but so what? Technology has existed since man rubbed two sticks together to make fire (and threw sharpened sticks at animals so that they could cook them over that fire). So we are all native to some form of technology. Technological advancements are now happening so fast that people born today are native to a technology that people born last week, last month, last year are not.

Also, a person born in one part of the (so-called flat) world may have no access to the technology freely available elsewhere, so being born into the "digital era" doth not a native make.

Many are the people, born well before the hypothetical digital watershed, who have taken to the technology without a murmur, while others born long after haven't the foggiest. This matter has been addressed in inimitable style (which will be utterly lost on anyone who struggles with the concept of irony) over the past few days by the irascible Grandad, one of my favourite bloggers (language warning). On the other hand, my nieces, born indisputably into the digital era, can't blog about their difficulties with the technology, because they wouldn't have a clue how!

Rather than trying to divide people into two camps, I would suggest that we approach each project without preconceived ideas:

  • Who is my audience for this project?
  • What do they need in relation to this initiative?
  • What is the best way to meet their needs, taking into account all factors, constraints and available tools?
Then, when the next project comes along, throw away the information compiled for this project and start again with a blank slate.

One size does not fit all. One size doesn't even fit most. One size fits one size... and one shape.

8 comments:

Michael Hanley said...

Hi Karyn,
In the nicest possible way I disagree with you completely, but I'm glad somebody posted an alternative view!

I would assert that there are points in history where innovations are made that change things irrevocably: from the invention of the wheel, and of farming, right through writing, metal-working, printing, steam power, gunpowder (they're not all positive) to antibiotics and computers, there are watersheds.

Of course, it's the case that not everybody benefits from these changes in equal measure, or at the same time. As Everett M. Rogers suggested in his 1962 book Diffusion of Innovations, innovations spread through society in an S-curve: the early adopters take up the technology first, followed by the majority, until the technology or innovation is commonplace.
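(As an aside, Rogers' S-curve is easy to make concrete: cumulative adoption is commonly modelled with a logistic function. Here is a minimal sketch in Python; the growth rate r and midpoint t0 are illustrative values of my own choosing, not figures from Rogers.)

    import math

    def adoption(t, r=0.8, t0=10.0):
        # Cumulative share of the population that has adopted an
        # innovation by time t, modelled as a logistic S-curve.
        # r = growth rate; t0 = midpoint, when half have adopted.
        return 1.0 / (1.0 + math.exp(-r * (t - t0)))

    # Slow start (early adopters), steep middle (the majority),
    # flattening tail (the innovation is now common):
    for year in range(0, 21, 4):
        print(f"year {year:2d}: {adoption(year):6.1%} adopted")

Running it shows the characteristic shape: a fraction of a percent at year 0, roughly 17% at year 8, 83% at year 12, and near saturation by year 20.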

I would say that terms like "Digital Native" etc. are simply an analogy or metaphor to characterise a cohort of people in a particular circumstance, in much the same way as the phrases "Baby Boomer" or "Yuppie" do. That said, I don't think access to all this cool stuff will make a blind bit of difference to you if you were born post-1990 in a slum in Peru or on a farm in rural China.

But then, access to technology has always been the prerogative of the privileged; I'm sure a serf in mediaeval Europe neither knew nor cared about the Renaissance either. Nevertheless, the explosion in ideas and knowledge at that time provided the foundation for the world we now inhabit.

I do think you're bang on about providing appropriate learning solutions for distinct audiences, and funnily enough, I believe the technology now available to us will probably enable customised learning that aligns to individuals' learning needs and styles in ways unimaginable to those of us who grew up in "the old country".
--

Anonymous said...

Speaking as a non-academic, digital immigrant, I would like to add my tuppence worth.

I agree wholeheartedly with Karyn on this subject, and frankly find Michael's assertion mildly insulting. The argument that the "serf in mediaeval Europe didn't know nor care about the Renaissance" does not hold true here, since at that time the common man had no access to news of national and international events. Nowadays, the majority of the world's population must be aware of the digital revolution, whether they have access to it or not. There seems to be an inherent implication that we 'immigrants' should be given some sort of special treatment just because we were born outside the Magic Digital Circle.

I speak as someone who discovered computers in my forties, and became a professional programmer in my fifties. I am now retired but run my own web design and hosting business. I am aware that I am a bit of an anachronism, but I am not unique by any stretch of the imagination. I might add that I am also a qualified teacher and have taught web design to a wide age range and ethnic diversity. The only qualification needed to handle 'digital native' technology is a willingness to learn.

As a blogger, I find my readers have a very wide age profile, ranging from teenagers to octogenarians. The majority of my 'elderly' readers are bloggers themselves. In fact, I would go so far as to say that the digital divide has nothing whatsoever to do with ethnicity or age, but comes down merely to interest or the lack of it.

Michael Hanley said...

Grandad - thanks for taking the time to comment on my response to Karyn's original post.

First of all, I didn't intend to cause umbrage with the analogy I used in my comment, and certainly not to insult you.

If it would help move the discussion forward, I will clarify my position, beginning by defining a contentious term that has, I believe, been at the heart of this debate, and that seems to strike at the core of people's self-perception: what is an immigrant?

By random chance, I happen to be Irish (and proudly so); if I chose to move to the UK or continental Europe for economic or other reasons, that would make me an immigrant, or as the Collins Dictionary says, "a person who comes to a country to take up permanent residence." It's neither positive nor negative; it's just a fact.

Marc Prensky's 2001 essay Digital Natives, Digital Immigrants is actually about the failure of the education system to meet the learning requirements of those born in the final part of the 20th century. As he points out:

"Today's students have not just changed incrementally from those of the past, nor simply changed their slang, clothes, body adornments, or styles, as has happened between generations previously. A really big discontinuity has taken place. One might even call it a "singularity"—an event which changes things so fundamentally that there is absolutely no going back. This so-called "singularity" is the arrival and rapid dissemination of digital technology in the last decades of the 20th century...Our students today are all “native speakers” of the digital language of computers, video games and the Internet."

He goes on to suggest that those of us who are not native to "the digital world" but have adopted some or most aspects of the new technology are (and, compared to them, always will be) Digital Immigrants.

"The importance of the distinction is this: As Digital Immigrants learn – like all immigrants, some better than others – to adapt to their environment, they always retain, to some degree, their "accent," that is, their foot in the past. The “digital immigrant accent” can be seen in such things as turning to the Internet for information second rather than first, or in reading the manual for a program rather than assuming that the program itself will teach us to use it. Today's older folk were "socialized" differently from their kids, and are now in the process of learning a new language. And a language learned later in life, scientists tell us, goes into a different part of the brain."

From a cultural perspective, I agree in large part with what Prensky says. However, I am not convinced by some of his assertions, particularly in the domain of digitally-mediated learning and the ability of different groups to use technology effectively. My view is that understanding how to use technology to teach benefits everybody. Equally, I don't think it's likely to change that some people learn more effectively by visual, auditory, or kinaesthetic means. What has changed? Digital natives' ability to process information from multiple simultaneous sources, and how they use that knowledge.

By extension, shouldn't the way kids are taught embrace the new tools and technologies to enhance their learning experience? When I was a post-primary student, one of the contemporary debates was "Should we allow pupils to use electronic calculators during their exams?" As a 15-year-old it was blindingly obvious to me that learners should be allowed, even encouraged, to take advantage of the available technology - when was I ever going to use a slide rule to carry out calculations, as my teachers had? Even then, I knew the answer was "Never", and 20 years later that has proved to be the case.

I would suggest that the topic as originally posted on the Learning Circuits Blog Big Question has gone off the point - a discussion about developing educational strategies that use digital technology to enhance learning - and has become some kind of moral or ethical debate. Certainly, I'm surprised at the depth of feeling the terms used have provoked, and at the way it has become a binary, "this good, that bad" dialogue that seems to carry a negative resonance and political overtones.

In her original comment, Karyn mentions that her "...nieces, born indisputably into the digital era, can't blog about their difficulties with the technology, because they wouldn't have a clue how!" I think that's an appalling failure of an education system that is supposed to be enabling these kids to function in the world, and it's addressing these kinds of deficiencies that the debate should really be about.

--

Anonymous said...

@Michael I'm afraid I must continue to differ with you.

In my view, your analogy does your argument more harm than good.

You see, I am an immigrant. I am a South African living in the UK. I packed up my home, my family and my life and moved here 9 years ago. However, I never moved from anywhere to anywhere in terms of digital innovations.

I'm sure Ireland has changed around you in your lifetime. South Africa certainly changed immeasurably in mine. Does that make you an immigrant? I only became an emi/immigrant when I moved. Those of my friends who have stayed there are natives to a country that looks nothing like the one into which they were born. The analogy goes deeper, but I will leave it there for fear of stirring up an emotive, political situation in what is an academic debate.

The technological landscape has changed around us. We are a rich and diverse population. None of us moved anywhere - none of us came in from the outside.

Yes, my nieces have been failed by the system, but the education system can't carry all the blame. They have been failed by another group of people closer to home... in fact at home. After all, if parents want their children to know something that isn't taught at school, they have the option to teach them themselves.

Anonymous said...

Michael - You miss my point. I am well aware of your definition of an immigrant in this context, as one who is not native to the technology. Am I correct in that?

As someone who was born in the middle of the last century, I think I would qualify as someone who was not brought up in the digital age. Mine is more the age of crystal sets, and the BBC Home Service. However, I think I have adapted reasonably well to my new surroundings. I don't feel alienated. I do sometimes wonder at the obsession over iPods and their ilk, but generally I am very happy with the Internet, not only as my first port of call for information but also for online shopping. For the last ten or so years I have done all my grocery shopping online as I hate shopping centres.

I was a bit harsh in my terminology. I was not insulted or offended as such. It was a bad choice of words, and I apologise. What I was trying to say was that I felt there was more than a hint of condescension in the implication that 'us older folk' should be spoken to in words of one syllable. I admit that I am familiar with the slide rule and logarithm tables, yet I still use a computer to do my calculations. If anything, I have a distinct advantage over the Digital Natives in that I was taught to be self-reliant in mental arithmetic.

I would almost go so far as to say that Digital Natives are the ones who are at a disadvantage. They have been spoon-fed information and have become over-reliant on the computer as the answer to all problems. We older folk have the advantage of the older methods as well.

I too remember the arguments about calculators in exams, though in my case, it was slide rules. I think the downfall of modern mathematical education was the calculator. While it is an incredibly powerful tool, and invaluable, it is not always available. This is amply illustrated if you ask a twenty-year-old to tot up a lengthy bill in their head. The majority will have difficulty!

I think that computers and the Internet are incredible tools. In my school days, they would have been deemed the stuff of fantasy, if not magic, yet here they are. However, neither is infallible and neither is available at all times. My contention is that the modern educational system has become over-reliant on modern technology, and that some of the old methods should be incorporated. I do my grocery shopping online, but I am still capable of going into a shop myself [to use an analogy].

We dinosaurs can teach 'em a thing or two yet!

Michael Hanley said...

As our discussion has unfolded, I have begun looking at the literature to gain a better understanding of "the big picture" and the commentaries and opinions others have contributed to this topic over the last seven years or so.

It would be an understatement to say that it's contentious. For an article written in 2001, Marc Prensky's original piece has generated a lot of discussion, with opinions ranging from Bennett, Maton & Kervin's assertion that the idea of 'digital natives' is an "academic form of a moral panic" (The 'digital natives' debate: A critical review of the evidence, 2008), right through to the Digital Natives project, a joint initiative of the Harvard Law School and the Research Center for Information Law at the University of St. Gallen in Switzerland, which

"...explor[es] the impacts of this generational demarcation between those born with these technologies and those who were not. The project will address the issues and benefits of this digital media landscape and gain valuable insight into how digital natives make sense of their experiences online."

My view is that the liveliness of the debate demonstrates the currency of this topic. I would suggest that when developing new ideas, an "extremist" and even binary approach such as the one Marc Prensky takes when defining his native/immigrant categories has value, in that it becomes a starting point that enables people to negotiate their way to a more accommodating, moderate view over time; it's a strategy employed every day, for example by trade unions when negotiating better pay and conditions for their members. This becomes problematic, though, if such views harden and attain the characteristics of an infallible ideology.

To be honest, the debate's interest to me is founded in the challenges and opportunities of teaching kids using the range of both traditional approaches and new technologies, rather than the semantics we use to describe it. I would suggest that one of the reasons it is such a "hot" topic is that the whole notion of living in a digital world is so new that, as a culture, we're still grappling with the concepts (and implications) of this brave new world, and consequently we struggle with the language we use to define it.

But what I've really been surprised about is the depth of feeling the subject has aroused.

I think that this is partly because of the socio-economic, political and cultural implications associated with the natives/immigrants terms used. I wonder would this be the case if Prensky had chosen to call the two groups he discusses 'Group A' and 'Group B'?

All labels are, by their nature, generalisations, and can lead to stereotyping (which in itself can have a negative implication, even if none is intended). However, we all use labels to categorise groups of people in certain situations, mainly, I believe, because they provide a linguistic shorthand so we don't have to go into tedious detail when discussing that group; don't we all use terms like Yuppies, WAGs, Baby Boomers, Mods, Kiwis etc., just to easily capture the characteristics of a group when communicating with others? In this sense I feel it is valid to use terms like 'digital natives.'

Strange as it may seem, Karyn & Grandad, I agree with both of you more than I diverge; for example, Karyn, I think you're right when you say that the responsibilities of educating kids don't stop at the school gate and that parents and others have a role in kids' education. I believe that this can be achieved using the media and hardware they (the kids) are already familiar with. Look at the popularity and success of programs like Big Brain Academy for the Nintendo DS. Extending from this, the same company has a concept called "Touch Generations":

"...games [that] do just that - they touch generations. It's about games that appeal to us all, no matter how old or young."

This is all very well as long as parents have the know-how to guide their children in these domains; many people don't have the time, the interest or the wherewithal to teach their kids how to contribute to the read/write web (and, more importantly, how to appropriately interpret and filter the information available via the networked world).

This is the disconnect that formal educational institutions must bridge. However, innovative approaches to learning are not being taken advantage of in the formal primary or secondary educational domains, maybe because
a] it has never occurred to those involved at a policy-making/governmental level that you can educate in this fashion;
b] even if they have considered this, they don't know how to integrate these new ways of acquiring skills and knowledge with current continuous assessment and exam structures; and
c] probably a lot more reasons besides.

Grandad - just as using the slide rule helped you to understand the mechanics of maths, I began to learn about computers on an Apple II and a Commodore PET, in an environment where you had to boot the OS via a 5 1/4-inch floppy in Drive A before creating your BASIC or Pascal program via the disk in Drive B. I believe that learning about computers in this way shaped and advanced my understanding of the subject in ways that would be very difficult if not impossible to quantify. While I recognise the benefits of learning in this fashion for me, I doubt that I would have much success in persuading a contemporary 12-year-old of the benefits of this approach. Much better to enable kids to explore the concepts of designing, building and programming things using tools like Lego Mindstorms.

There are certain skills and types of knowledge that have to be taught using plain old (and perceived as boring) repetition and drill-and-practice, such as learning the "3 Rs", music notation and scales, learning to swim etc., but there's no reason that acquiring this knowledge shouldn't be technology-assisted to a greater extent than it is at the moment; a toy sketch of what I mean follows below.
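(By way of illustration only - a toy sketch of my own devising, not any particular product - even the humblest repetition drill can be technology-assisted in a few lines of Python, with instant feedback and a running score.)

    import random

    def times_table_drill(rounds=5, max_factor=12):
        # Old-fashioned drill-and-practice: ask random
        # multiplication questions, check each answer
        # immediately, and keep score.
        score = 0
        for _ in range(rounds):
            a = random.randint(2, max_factor)
            b = random.randint(2, max_factor)
            answer = input(f"What is {a} x {b}? ")
            if answer.strip() == str(a * b):
                score += 1
                print("Correct!")
            else:
                print(f"Not quite - {a} x {b} = {a * b}")
        print(f"Score: {score} out of {rounds}")

    times_table_drill()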

At later stages of learning, most if not all kids would benefit from a more self-paced approach to learning, but one of the key challenges here is devising pedagogical approaches that motivate and satisfy children's innate curiosity and desire to investigate the world, while retaining a structure that ensures the learning is directed towards specific educational goals that equip children to function effectively as adults.

In my original response to the LCBBQ (here: The E-Learning Curve: May LCBBQ) I said that:

"...the fundamentals ...won't change especially; people still learn in the same way, and as much as technology may change, we as a species are still lumbering around with our hunter-gatherer brains."

And this is the one constant in a world of change: people have adapted and generally prospered as new technologies have created new environments, from the invention of farming 10,000 years ago onwards, and will continue to do so.
--

Anonymous said...

@Michael. You are no doubt right that, for some people, the semantics are a sticking point. However, for me, it's not about the words - it's about the lines.

I am suspicious of any model that attempts to pigeonhole people on the basis of intangible, infinitely variable premises, which 'Group A' and 'Group B' would do just as decisively as the current terminology.

Just as I reject the notion of 4 (or 3 or 7, depending on which of the 70-odd existing models is your preference) learning styles, I reject the notion of two types of technology user.

Since my learners are work-based adults (usually in a corporate environment), I deal with a huge range of ages, cultures, etc. with every solution I design. I prefer to learn what I can about my unique target audience in each instance, rather than making preconceived assumptions based on age... or any other factor, for that matter.

Michael Hanley said...

I think my last word on the subject will be that I wouldn't call Marc Prensky's ideas about Digital Natives/Immigrants a model - more of a theory, maybe a hypothesis, and it should be treated as such.

It's been a pleasure taking part in this stimulating intellectual exercise here on your blog, so thanks for taking the time to engage in such an interesting debate over the last few days.

All the best - Michael
--