Gaming has received a lot of bad press over the decades, with studies linking it to short attention spans and violent behaviour in impressionable youths. But technology has come a long way in recent years, and with the Hong Kong Government pumping xx of investment into xx, is it time we turned gaming on its head and used it for the greater good? xx Simon John gives us food for thought.
It’s the year 2000. I, along with my friends, am sitting in our Year 8 history class back in Essex, England, studying Ancient Rome and the achievements of that time. Our teacher, having already made us produce homemade news articles detailing the exploits going down in the Colosseum and analytical-style pieces questioning the reliability of accounts from that era, comes out with a proposal none of us is expecting: “The film Gladiator is out — a very accurate piece of cinema in my view. I suggest you go and see it.”
I was younger than 15 at the time, and did not sneak into the cinema to see the 15-rated movie. However, the fact that my teacher even suggested we should see it has always stuck with me. Gladiator must have been, in his eyes, the most accurate portrayal of that era, as well as an exceptional piece of cinema: a solid plot, a superb cast, and enough blood and gore for your average 13-year-old (oops).
Unlike many of my friends, I was into history in a big way, so for me, demonstrating the weaknesses in a piece of written historical evidence through understanding the subtle societal, political, and institutional interplays of the time came naturally. For many of my friends, however, it didn’t, and they found it far more difficult to visualise events through the interpretation of language than I did. I could see, however, that when some visual medium was incorporated — be it watching a film or TV show, or even looking at pictures — this suddenly made things easier.
Is it about time we used computer games in this way?
Computer games often attract scorn and derision among teachers and parents, especially when it comes to persuading students to study. They see very little value in the hobby of gaming, and are more inclined to see it as harmful in some way to their health or learning, relying on studies from the ’90s and ’00s that claimed to show correlations between gaming and any number of ills, from short attention spans to violent outbursts and worse.
Games back then were in an infancy of their own. Yes, children might be a nightmare when they are young: excitable, pent-up balls of energy with an inexhaustible curiosity and an unawareness of the trouble they sometimes cause. But eventually they grow up. The gaming industry of the ’90s was itself an infant, lacking focus, skill, or precision, but attempting everything nonetheless. What else could explain the genesis of games like Grand Theft Auto, which were fun purely because of the two-dimensional chaos you could cause?
The ‘noughties’ saw games grow up somewhat. Mario was still on the scene, as Nintendo has always proudly embraced the adult’s inner child, but Sonic was gone, and games based around gore and carnage were, by and large, shunned. If they didn’t improve themselves in some way, either through graphics, playability, or storyline, then they would be buried in a deep desert grave somewhere in New Mexico with the other lacklustre gaming efforts before them.
But they did mature, in a sense. The characters in games were more fleshed out; the visuals exploited the larger RAM sizes; and the originality that had always been possible began to flourish. Even fighting games once found at the back of some shadowy arcade, like Tekken or Street Fighter, started building compelling character profiles and backstories. Games like Primal Rage, which focused on dinosaurs and King Kong-style apes beating each other to death, were no longer interesting.
There are so many titles to mention when considering how far games have come. Call of Duty would be such a game, making use of the emerging interconnectivity of players in cooperative missions that would have you screaming in frustration down your headset at some dude living on the other side of the world — at which point a stream of insults would ensue that would have you and your friends laughing for days.
Grand Theft Auto has also come a long way. When Grand Theft Auto: Vice City was released, gamers were instantly on side. It was more than just the gameplay: this fictional city was filled with characters and experiences that were hilarious and absorbing in equal measure. The radio stations, for example, served up classic ’70s and ’80s tunes that made those car thefts all the more memorable (and many people, I’m sure, have the Vice City playlist on their Spotify — as do I).
A turning point for the gaming industry
Mainstream video games have been rocketing in popularity for more than thirty years, and in that time every technological advancement that could possibly hasten the improvement of the overall experience has manifested itself: the sound quality of speakers and headsets, computing power, Bluetooth, online purchases and downloads, high-definition picture quality, the emergence of 4K, stronger streaming quality for video chat, and now the emergence and growing capabilities of both AR (Augmented Reality) and VR (Virtual Reality). All of these have made it possible for more immersive, more believable, more creatively ground-breaking games to be developed.
It would seem that, in terms of both societal and economic popularity, gaming is now on par with those other mega industries like cinema and television, and is set to supplant them in the future. So why aren’t we even trying to bring them into schools in the way we have done with those other types of media?
Could Hong Kong be the first?
There is a fear, I think, of what the reaction to such a bold move as this might be, which is understandable. If a movie was playing on the class television or projector, the remote always remained in the hands of the teacher. They may not have controlled the reaction, but they did control the delivery mechanism. Computer games are about the student taking control of events, of placing themselves in the world of the avatar they are assuming and experiencing the narrative themselves. It means, therefore, that the teacher will have to cede some control for it to even be possible.
The other thing is the lack of familiarity teachers have with the medium. Computer games are not a part of any PGCE programme, and are not necessarily something everyone will encounter, unlike films, TV shows, and even books. There is as yet no training in how they could be employed, and if you have never used them before, you would be unlikely to want to bring them into lessons you are otherwise well-versed in.
Then there is the lack of any real catalogue. No one has taken the time to sift through the titles out there to find those that complement existing subject syllabuses. This would itself be time-consuming, as well as difficult to control. If, for example, only a particular portion of a game is suitable, how is that portion alone isolated? Then there are issues with regard to parental guidance certificates and adjudicating age appropriateness, the inability to edit out controversial content, and the ultimate lack of information as to the approach’s effectiveness.
It all raises concerns. But this is placing the inconvenience of testing it out above the potential benefits that bringing gaming into syllabuses could deliver. And who knows — students may resent having something they enjoy as a way of unwinding after a long day of study tainted with the idea of educational worth.
Level up or game over: is it time we brought gaming into the classroom?
What do you think? Let us know your thoughts on this topic in the comment section below.