November 27th 2022
As computers take on one of the most prominent roles in our daily lives, it is imperative that we reflect on the current state of our software to ensure it keeps pace with the rapid advances in hardware. Most popular programming languages and protocols are built on specifications that are decades old. Though it's true that these languages have continued to improve, receiving regular and significant updates, comparing the core design philosophies of that era with the modern programming landscape shows that there is still room for innovation.
In the late 20th and early 21st centuries, programmers were able to maximise the performance of their code with respect to their hardware using fundamental programming languages such as BASIC, C, and Fortran. This led to the invention and implementation of stunning algorithms such as the fast inverse square root in Quake III Arena, a video game released in 1999, or the FFT (Fast Fourier Transform) used to compress 3D animation data in Crash Bandicoot, around the same time that the PNG format was emerging as a breakthrough in compression technology. As of late, however, one can't help but get the feeling that the innovation driving this kind of "perfect" exploitation of computer resources is beginning to fade. Nowadays, computer resources are rarely an issue when developing common applications, and as such the need for further innovation in software development seems scarce. Of course, this is not the full picture. Many applications still require this type of problem solving, such as graphics programming and simulation (a recent example being the algorithms used to model the spread of COVID-19 on supercomputers). As such, the need to exploit our hardware to the fullest is not going away, and neither should the skills behind it.
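The general idea behind that kind of transform-based compression is easy to sketch, without claiming any resemblance to the actual Crash Bandicoot code: transform the data into the frequency domain, keep only the handful of coefficients that carry most of the signal, and reconstruct an approximation from them. Below is a minimal, made-up illustration in C, with a naive O(N^2) DFT standing in for a real FFT and a toy sine curve standing in for an animation track.

    /*
     * Sketch of transform-based compression: transform a signal, keep only
     * its largest frequency coefficients, and reconstruct an approximation
     * from those few numbers. A naive O(N^2) DFT stands in for a real FFT;
     * the signal and sizes are invented for illustration. Link with -lm.
     */
    #include <complex.h>
    #include <math.h>
    #include <stdio.h>

    #define N    64   /* samples in the toy "animation track" */
    #define KEEP 8    /* complex coefficients retained after "compression" */

    static const double PI = 3.14159265358979323846;

    /* Forward (sign = -1) or inverse (sign = +1) transform, unnormalised. */
    static void dft(const double complex *in, double complex *out, int sign)
    {
        for (int k = 0; k < N; k++) {
            double complex sum = 0;
            for (int n = 0; n < N; n++)
                sum += in[n] * cexp(sign * 2.0 * PI * I * k * n / N);
            out[k] = sum;
        }
    }

    int main(void)
    {
        double complex signal[N], freq[N], approx[N];
        int zeroed[N] = {0};

        /* A smooth toy signal, standing in for one channel of an animation curve. */
        for (int n = 0; n < N; n++)
            signal[n] = sin(2.0 * PI * n / N) + 0.3 * sin(6.0 * PI * n / N);

        dft(signal, freq, -1);

        /* "Compress": zero out all but the KEEP largest-magnitude coefficients. */
        for (int pass = 0; pass < N - KEEP; pass++) {
            int smallest = -1;
            for (int k = 0; k < N; k++) {
                if (zeroed[k])
                    continue;
                if (smallest < 0 || cabs(freq[k]) < cabs(freq[smallest]))
                    smallest = k;
            }
            freq[smallest] = 0;
            zeroed[smallest] = 1;
        }

        dft(freq, approx, +1);

        /* Measure how well KEEP of N coefficients reproduce the original. */
        double err = 0;
        for (int n = 0; n < N; n++)
            err += cabs(approx[n] / N - signal[n]);   /* divide by N to normalise */
        printf("mean reconstruction error with %d of %d coefficients: %g\n",
               KEEP, N, err / N);
        return 0;
    }

Storing only those few coefficients instead of all of the samples is the entire "compression"; the trade-off is the reconstruction error printed at the end, which for a smooth signal like this one is tiny.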
Increasingly, programmers have become further and further removed from the bare metal of the computer. Where one generation of programmers may have written assembly to access memory directly and set the computer's raw buffers, the next calls upon the code of the last, building their own frameworks around it. This process has greatly increased accessibility, allowing more people to enter the field by smoothing an otherwise steep learning curve, and giving us variability in the level of control we have as programmers - how far we want to plunge down through the layers of abstraction. These continually expanding frameworks have allowed programmers to produce code rapidly, but at the expense of potentially significant inefficiencies. The more this cycle continues, the fewer people will understand how the core of the computer works and how to take advantage of its resources. Though these overarching changes mostly benefit us, it is important to avoid over-dependence on, and stagnation of, old lower-level code.
Everyone just assumes, "someone far more experienced at programming than I must have implemented this, so it must be faster than anything I could write", but sometimes this is not the case. When designing Quake III Arena, the developers at id Software could have contented themselves with the traditional mathematical implementation of the inverse square root (dividing by the square root function provided in the C standard library), giving in to that same vein of thinking, but instead they looked deeper into the very binary nature of the computer for the sake of optimisation. They knew that, hidden in the very way the computer stores a floating-point number, there was a place-value system they could exploit to approximate this vital function, used to normalise the vectors in lighting calculations. I find this type of thinking to be at the core of what makes the art of programming a truly beautiful practice. There is a joy in the discovery of an efficient algorithm, similar to that experienced by a mathematician or an artist, and there is solace in knowing that a program runs at maximum efficiency, down to the very architecture of the computer.
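To make the trick concrete, here is the technique as a small C sketch, paraphrasing the routine that later appeared in the published Quake III Arena source (memcpy replaces the original pointer cast, which modern compilers treat as undefined behaviour).

    /*
     * Fast inverse square root, in the style of the routine published in
     * the Quake III Arena source. The float's bits are reinterpreted as an
     * integer; because of how IEEE 754 packs the exponent, that bit pattern
     * behaves roughly like a scaled logarithm of the value, so a shift and
     * a magic constant give a good first guess at 1/sqrt(x), which a single
     * Newton-Raphson step then refines. Link with -lm.
     */
    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    static float fast_rsqrt(float x)
    {
        const float half = 0.5f * x;
        uint32_t bits;

        memcpy(&bits, &x, sizeof bits);     /* reinterpret the float's bits        */
        bits = 0x5f3759df - (bits >> 1);    /* magic constant: first approximation */
        memcpy(&x, &bits, sizeof x);

        return x * (1.5f - half * x * x);   /* one Newton-Raphson refinement step  */
    }

    int main(void)
    {
        for (float v = 1.0f; v <= 5.0f; v += 1.0f)
            printf("fast_rsqrt(%4.1f) = %f   exact = %f\n",
                   v, fast_rsqrt(v), 1.0 / sqrt(v));
        return 0;
    }

The whole routine costs a shift, a subtraction, and a few multiplications, where the obvious 1.0f / sqrtf(x) pays for a full square root and a division. On today's CPUs a dedicated reciprocal-square-root instruction makes the trick largely a historical curiosity, but the mindset behind it is exactly the point.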
Part of the solution, I feel, is new languages that focus on novel ideas in their implementation and that build on previous generations to make them more intuitive and easier to understand. With a more diverse range of languages, we can adopt frameworks better suited to each unique application and keep questioning the designs we have inherited.
Though innovation comes slowly, change is coming. Apple recently announced that it will be deprecating OpenGL as the main underlying framework for creating graphical applications on macOS; in its place they have created their own framework, "Metal". The Khronos Group is developing "Vulkan", a modern cross-platform graphics API intended to replace OpenGL. Many modern game engines are already adopting these new technologies to improve performance in an area that has been largely stagnant in recent years.