Are morality systems making us immoral?
Morality systems in gaming are a huge trend at the moment – there’s something fundamentally fascinating about making a choice with far-reaching consequences, enabling you to explore what sort of person you are. But, on the whole, morality systems are poorly realised: morality often comes second to strategic gameplay, or to the fun of being evil without real-world consequences. That raises an obvious ethical question – does the use of morality systems in video games encourage immorality?
What does it mean to be moral? Essentially, it means being mindful of the choices in life that have good or evil consequences, but the question of morality is one that has long troubled philosophers and ethicists. How exactly can one define right and wrong, or good and evil? Well, game developers have tried, with incredibly mixed results.
The most common approach is to offer gamers a binary choice – a ‘good’ option and a ‘bad’ option – with certain bonuses or different endings depending on which you pick. There are some issues with this approach, not least that most moral choices are not simple binary decisions. If moral choices are linked to bonuses, they have to be easily solvable, and what ought to be a moral issue instead becomes a strategic one. You may not want to commit a morally dubious act, but if it makes the gameplay easier, or affords you a weapon you couldn’t otherwise obtain, then morality comes second to gameplay.
If moral choices are linked to bonuses, they have to be easily solvable, and what ought to be a moral issue instead becomes a strategic one
In Star Wars: Knights of the Old Republic, the morality system measures your devotion to the Light or Dark side of the Force, with your actions and dialogue choices shaping your alignment. However, the system actively discourages you from making genuinely ethical choices, because sticking with your initial alignment brings gameplay benefits (cheaper Force powers, for example). If you choose the Light side and then make some Dark side choices because you think they’re the best option, the only result is that you make the game harder for yourself.
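To see how this plays out strategically rather than ethically, here is a minimal sketch of a commitment-rewarding alignment meter. The class, point values and discount rule below are invented for illustration – they are not taken from KOTOR’s actual implementation – but they show why, once consistency is what gets rewarded, deliberating over an individual choice only costs you.

```python
# Hypothetical sketch of a commitment-rewarding alignment meter.
# All names, numbers and the discount rule are invented for illustration;
# they are not KOTOR's actual mechanics.

class AlignmentMeter:
    def __init__(self):
        self.score = 0  # negative = Dark side, positive = Light side

    def record_choice(self, light_points: int):
        """Shift alignment by the (signed) weight of a choice."""
        self.score += light_points

    def force_power_cost(self, base_cost: int, power_side: str) -> int:
        """Powers matching your dominant side get cheaper; others cost full price."""
        dominant = "light" if self.score >= 0 else "dark"
        if power_side == dominant:
            # The deeper the commitment, the bigger the discount (capped at 50%).
            discount = min(abs(self.score), 50) / 100
            return round(base_cost * (1 - discount))
        return base_cost


meter = AlignmentMeter()
for _ in range(10):
    meter.record_choice(+5)                 # a consistent Light-side run
print(meter.force_power_cost(40, "light"))  # 20 - commitment is rewarded
meter.record_choice(-25)                    # one "best option" Dark-side choice
print(meter.force_power_cost(40, "light"))  # 30 - deliberation only hurts you
```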
This may seem a fairly obvious example, and one that is easy to pick apart, but even more nuanced morality systems have their faults. Take Dishonored, a game that sends you on a series of assassination missions and lets you complete them in a variety of ways. The option exists to eliminate all of your targets non-lethally (the ‘good’ option), but the end result of these choices is often not much better morally. Rather than killing Lady Boyle, for example, you can choose to deliver her to her stalker – this outcome, which is coded as ‘low chaos’ and therefore the more ethical route, feels morally worse than just killing her, and yet the game tells you that you’re a good person for allowing her to live. How does the player engage with a morality system in which both the ‘good’ and ‘bad’ options are bad?
Equally bizarre is the morality system of Frostpunk, a survival game in which you rule the last society on Earth. One of the game’s central questions is how you reconcile the pursuit of the greater good (saving your society) with the suffering you cause your people. You need to power a giant generator to stave off the endless winter, and to do so you sign laws, all of which carry a heavy moral toll (forcing children to work in the coal mines, for example). You can even sacrifice a child to buy everyone else more time, and the game is difficult enough to make this worth considering. However, you can do all this and still be considered ethical, because the game’s morality system works by adding up the evil decisions you make and, if you haven’t made too many, it considers you good. The maths makes the morality system little more than a difficulty setting – you can make hideously evil decisions, but you’re good as long as you don’t make too many of them.
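The tallying described above can be sketched in a few lines. The decision names, weights and threshold here are invented for illustration – they are not lifted from Frostpunk’s code – but they show why a sum-and-threshold check behaves like a difficulty setting rather than a moral judgement.

```python
# Hypothetical sum-and-threshold "morality" check with invented weights.
# A handful of hideous decisions still passes as good, so long as the
# running total stays under an arbitrary line.

EVIL_WEIGHT = {
    "child_labour_law": 2,
    "extended_shifts": 1,
    "sacrifice_a_child": 5,
}
GOOD_THRESHOLD = 10  # arbitrary cut-off, purely illustrative

def verdict(decisions: list[str]) -> str:
    total = sum(EVIL_WEIGHT.get(d, 0) for d in decisions)
    return "good" if total < GOOD_THRESHOLD else "crossed the line"

print(verdict(["child_labour_law", "sacrifice_a_child"]))      # good
print(verdict(["sacrifice_a_child", "sacrifice_a_child",
               "child_labour_law", "extended_shifts"]))         # crossed the line
```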
It doesn’t boil everything down to a simple good ending/bad ending, because morality is more complex than a simple binary
Perhaps the most layered morality system is found in Papers, Please, which puts you in the role of an immigration agent and, essentially, the moral arbiter of Arstotzka. You get to choose how you become corrupt and to what extent, but the game forces you to be complicit in your character’s actions – you may wish to save as many people as you can, but it will only allow so many before your mercy becomes a threat to you, and so you must choose whom to help. It doesn’t boil everything down to a simple good ending/bad ending, because morality is more complex than a simple binary.
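A rough sketch of that pressure, again with invented numbers: each act of mercy beyond a small allowance eats into the wage your family depends on, so the system forces you to ration compassion rather than handing you a verdict on it. None of this is Papers, Please’s actual code – it is a hypothetical illustration of the mechanic described above.

```python
# Hypothetical sketch of a mercy-has-a-cost mechanic; all numbers invented.
# Letting too many people through doesn't flag you as "bad" - it simply
# leaves less money to keep your own family alive.

DAILY_WAGE = 50
FREE_WARNINGS = 2      # mercies tolerated before fines kick in
FINE_PER_EXTRA = 20
FAMILY_UPKEEP = 45     # rent, food, heat

def end_of_day(mercies: int) -> str:
    fines = max(0, mercies - FREE_WARNINGS) * FINE_PER_EXTRA
    take_home = DAILY_WAGE - fines
    if take_home >= FAMILY_UPKEEP:
        return f"family fed (take-home {take_home})"
    return f"family goes hungry (take-home {take_home})"

print(end_of_day(mercies=2))   # family fed (take-home 50)
print(end_of_day(mercies=4))   # family goes hungry (take-home 10)
```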
Morality systems in games are necessarily reductive, and there’s usually a ‘right’ way to interact with them – a fairer approach would be to present various options and allow the player to judge whether what they did was right or wrong. As it stands, moral options often come second to gameplay convenience, and the frequently bizarre way these systems decide what counts as good can lead to a troubling view of morality. When the moral options are this bad, why not go the whole hog and be immoral?