Games and bugs: forever together?
Video games are brilliant; the feeling of immersion they offer is unlike any other form of entertainment. And current technological advancements mean that the creativity and storytelling potential of the medium are practically endless. In short, it’s a wonderful time to be a gamer. However, this is not to say that all is perfect within the industry. Of particular concern is the seemingly ever-increasing frequency of games released in a so-called “broken” state at launch, rife with glitches, bugs, and missing or non-functional features. But why is this happening?
Before proceeding further, it is worth noting that this article is in no way trying to claim that buggy games are an exclusively contemporary phenomenon. Doing so would mean having a rose-tinted view of the game industry. Bugs and glitches have been an inevitable feature of games since their inception decades ago. They are certainly an unwelcome one but form part of the package. No one disputes that. What I want to focus on is why this continues to occur in such a widespread manner, despite the technological advancements that should – at least in theory – make it less of a problem.
In fact, compared to the past, not only has the situation seemingly not changed, but some even argue that it has actually got worse, citing the messy launches of AAA productions such as Cyberpunk 2077, Assassin’s Creed Unity and Skyrim, and the ever-increasing reliance on remote software updates aimed at fixing issues after release. The latter points to a lack of quality in games at launch: if the finished product were up to an acceptable standard, there would be no need for them. That their use is increasing only suggests that an ever-greater number of games are being released below par.
Identifying whether this phenomenon is becoming more frequent is no easy task. One would have to account for the ever-rising complexity of games, and for the fact that, as the general player base grows, so does the capacity to swiftly identify issues within any video game. Furthermore, social media means that any identified issue can quickly make the rounds of the web and come to the attention of other players, something that was not the case in the past: most in-game issues stayed in-game, with no way to share them with the rest of the world and provoke massive backlashes. In a way, players were blissfully unaware of just how buggy the games they played really were.
Going beyond this, what really interests me is understanding why so many games continue to come out rife with bugs and broken features. While it is true that game code has become more complex, and hence so have the bugs and glitches it gives rise to, I don’t think this is the whole story. It’s easy to blame game developers. After all, they are the ones responsible for producing the game, in charge of crafting it and of making sure the end product is acceptable. Yet, while they certainly bear some responsibility, they are not the only agents involved.
Developers themselves ultimately have to answer to the heads of their studios (including the shareholders), whose main concern is the economic value of the games. This was already a big preoccupation back in the day, but it has only become more so as the gaming industry has grown in scale and resources, becoming an enormous and complex machine. Nowadays it involves big corporations and major studios, which bet everything on the success of their games. Hence, release dates are carefully planned to maximise profits and minimise competition, and even slight delays risk sending events spiralling: changed marketing campaigns, lost expected profits, delayed subsequent releases and so on.
The gaming environment has become hyper-competitive, one where major hitches are simply not allowed, as the fallout can be disastrous: millions lost, studios and jobs gone, and more. Developers feel the pressure to release games by the planned date, even if they are far from finished, because the drawbacks of shipping early are still smaller than those of delaying the launch. As a consequence, many games are released bug-ridden, as developers do not have the time, budget, or workforce (teams are often small relative to the size of the projects) to comb through the code and fix its problems. It’s far from ideal, but it’s the way things are.
All this does not mean that a game released in a broken state is destined to remain so. As mentioned earlier, developers routinely take advantage of players’ now-ubiquitous internet access, which lets them easily and remotely download game patches (something hardly possible in the past). In this way, improvements can be delivered on a rolling basis. Again, while not ideal, it is far better than a game left perpetually broken and sub-par.
Ultimately, despite the possibility of remotely downloading patches, for me the issue remains that bugs and glitches take away the feeling of immersion I treasure so much in video games. Encountering a bug is a stark reminder that it is all an illusion, precisely when a video game should make you feel part of the story. Even if bugs are later fixed, they still taint one’s experience of the game. And that’s a pity, both for the player and for those behind the game, who would surely rather see their work fully enjoyed and appreciated than become an object of frustration, or be perceived as a simple cash-grab.