What does it mean to be digitally literate in 2012?

Out of 28,000 teachers who qualified in 2010, just three had a computer-related degree. If that's who is teaching the subject, it makes me wonder just how tech-savvy a lot of 10 to 15 year olds really are.

When I was 10 in 1996, my parents purchased our first ever home computer. It had the latest Intel Pentium processor, clocking in at a whopping 120MHz. To put that in perspective, most iPhones today run at ten times that speed.

Things were tricky at first. Windows 95 took a while to get used to, and most video game developers still preferred MS-DOS as a platform because it provided a more stable environment. DirectX was in its infancy and the Xbox was over a decade away. Microsoft's Plug and Play technology rarely applied, and even more rarely worked.

Fifa 96 was a Christmas present that year, but because MS-DOS couldn't correctly identify the computer's CD-ROM drive or sound card, I didn't get to play it until February 1997.

When I attempted to run the game, the computer, instead of giving me specifics, produced three words:

'read file error'.

A RAM upgrade and two months spent ensuring that MS-DOS was in fact aware that the computer had a CD-ROM drive resulted in me falling behind with my homework. More importantly, however, I could now play Fifa 96!
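
For anyone who never had the pleasure, making MS-DOS aware of a CD-ROM drive typically meant loading a device driver in CONFIG.SYS and Microsoft's CD extensions in AUTOEXEC.BAT. A minimal sketch of the sort of thing involved (the driver filename and drive letter here are illustrative; the exact driver depended on the drive):

    REM CONFIG.SYS: load a CD-ROM device driver and give it a name
    DEVICE=C:\CDROM\OAKCDROM.SYS /D:MSCD001

    REM AUTOEXEC.BAT: MSCDEX attaches a drive letter to that same name
    C:\DOS\MSCDEX.EXE /D:MSCD001 /L:D

Get either line wrong, or use the wrong driver, and the drive simply didn't exist as far as DOS was concerned.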

The experience taught me several valuable lessons. I had a better grasp of how to use a command-line operating system and understood the basic relationship between software and hardware. It probably did a lot to improve my problem-solving skills. At the same time, my classmates at school would share their stories of MS-DOS over a packet of crisps at lunchtime. A group of us were looking forward to studying computing as a standalone subject. Unfortunately, computing at high school involved very basic tasks that nearly everyone in the class could complete with relative ease. The closest we got to solving problems of any interest was the occasional bit of tricky programming.

But it wasn't enough. 

A lot has changed since 1996. Microsoft is no longer cool and I no longer feel compelled to play video games. A growing number of people do most of their work on a small computer that rarely goes wrong. The vast majority of software works straight out of the box, and hardware really has become a simple plug-and-play affair. Provided a PC meets the minimum requirements for a game, today's 10 year old is unlikely to run into the problems I did in 1996. Even living room electronics are relatively straightforward to use compared with the now redundant VCR, which had to be programmed as if it might land a man on the moon; even NASA's programmers must have lost sleep after missing the last ten minutes of The Cosby Show.

Children today will never experience the true joy of showing their parents how to correctly programme the VCR, although they can probably teach them a thing or two about their smartphone.

I recently purchased an AV amp for my living room. It takes in all your video and audio sources and routes them out to your TV and speakers. Unlike most other bits of living room kit, it is a tad more complicated to set up. Sad to say, I really struggled to get it up and running; I was out of practice with a bit of tech that doesn't 'just work'. The ten-year-old me might have been more successful, but how would a 'tech-savvy' ten year old in 2012 cope?

There is a lot of talk about why children need to become digitally literate, but are we really sure what digital literacy means? Being able to understand how computers work, and how to fix them, should be an essential part of any 21st century curriculum. Is that digital literacy? Any 10 year old can learn how to use PowerPoint, Facebook and a smartphone without any adult guidance. Is that digital literacy?

To me, an education in real digital literacy would teach children, through experimentation, about the nuts and bolts that make up a piece of technology. That would more closely mirror how the subject is taught at university level.

As everyday technology has become more reliable, what I would term 'real digital literacy' has taken a back seat, and that isn't going to produce the next generation of British software developers. People my age and older had to deal with computers that were relatively unreliable. In 1996, everyday software and the rise of the internet were constantly spurring on hardware development, and people often upgraded their old machines because RAM was a cheap(ish) way of enhancing performance. In 2012, a modest laptop can do pretty much everything the average user needs with little intervention. I do like the idea of schools going back to basics when it comes to teaching computer science: working without graphics forces students to visualise the problem internally.
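
To give a flavour of the back-to-basics exercise I have in mind (a hypothetical example of my own, written in Python, not something lifted from any syllabus), consider a plain text guessing game:

    # Text-only exercise: find a hidden number in as few guesses as possible.
    # The interesting part happens in the student's head: each answer of
    # 'too low' or 'too high' shrinks the range of numbers left to consider.
    import random

    def guessing_game(low=1, high=100):
        secret = random.randint(low, high)
        guesses = 0
        while True:
            guess = int(input("Guess a number between {} and {}: ".format(low, high)))
            guesses += 1
            if guess < secret:
                print("Too low.")
            elif guess > secret:
                print("Too high.")
            else:
                print("Got it in {} guesses.".format(guesses))
                return guesses

    guessing_game()

There is no window, no mouse and no framework here, yet a student who learns to play it well has quietly reinvented binary search in their head.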

Computer science curricula are undergoing a revamp at present, but given the rate at which technology changes, why has this taken so long? A quick glance at last year's Scottish Computing Standard Grade paper reveals that little has changed since I sat the same exam in 2001.
