Technology has been evolving at such a rapid pace that the period since the 1980s has been dubbed the information and telecommunications revolution, with no signs of slowing down. Computers have shrunk from the size of microwaves to iPhones, getting quicker and more powerful with each model released.
The developments have been remarkable and have transformed entire industries and businesses. So much so that coding has been brought into the national curriculum, even for children as young as five!
While this is an excellent way to push technology further, the decision has left many adults, and anyone without a programming background, stumped. There is now a debate over whether everyone should learn to code, with one argument being that not doing so will be the equivalent of being innumerate or illiterate in years to come.
While this could well be true, should everyone really learn to code? Those working in hospitality or other sectors unrelated to technology could arguably still benefit, since even our phones run on layers of software. But learning to code is not going to be a quick fix: it takes time, and probably a lot of money that most people simply don't have.
Sean Blanda argues that "The smartest workers will be able to leverage technology to their advantage and be able to recognise the big-picture ways to utilise it" and makes the important point that technology will change. So rather than everyone rallying to learn to code without any real reason (yet), perhaps we should simply develop an understanding of its uses, especially because, at the current pace of change, the coding skills learnt today could be out of date within ten years.
As Peter Wayner of InfoWorld puts it: "If hitting a target is hard and hitting a moving target is even harder, then creating a new hit technology is next to impossible because the shape and nature of the target morphs as it moves".
In Brighton, where technical development is thriving and web development and programming companies are popping up everywhere, such change seems unavoidable. But in less technologically developed locations there is not the same drive to innovate. For millennials like me, learning how to use these technologies is likely to be essential for future jobs, particularly in tech hubs and bigger cities, as I have mentioned. Yet for those working as, say, shop assistants or waiters, will it really be necessary? They may be expected to use the internal programs through which they are trained, but they will not be expected to create them.
In much the same vein that GPUs are fast becoming the next CPUs, Peter Wayner also highlights that it will soon be possible to write code and let the machine decide when it can run effectively on the GPU. While this may seem particularly advanced to some, it is already being done at a basic level, particularly in academia. As technology develops further, this could easily become standard procedure, much as wearable devices have, thereby removing the need for all of us to have learnt to code.
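To make that idea a little more concrete, here is a minimal sketch of what "write the code once and let the machine pick the hardware" can look like today. It is my own illustration rather than anything from Wayner's article, and it assumes the PyTorch library; the same calculation runs on a GPU when one is present and quietly falls back to the CPU otherwise.

```python
# Illustrative only: device-agnostic code that uses a GPU if available,
# otherwise the CPU. Assumes PyTorch is installed (not from the article).
import torch

# Choose the best available device at runtime instead of hard-coding it.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)

# The same line of code executes on whichever device was selected above.
c = a @ b
print(f"Ran the matrix multiply on: {c.device}")
```

The point is that the person writing this does not need to know how the GPU schedules the work; the library decides where the computation runs.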
I understand the ethos of the argument that we should all learn to code. Technology is developing rapidly, and those who are unfamiliar with smartphones or cloud services are already being left behind. However, I firmly stand by Blanda's and Wayner's arguments that it is not wholly necessary for us all to learn how to code. Familiarising oneself with code at a basic level, as universities and local organisations are promoting, yes; but learning to code an entire program is neither practical nor justifiable given the pace at which technology is changing.
I am a young, tech-obsessed Brit living in Brighton, interested in sport, travel, gadgets & software.