I've spent the week enjoying Dishonored, a Victorian-steampunk fever-dream of a game that came out earlier this month. In it you play Corvo, a bodyguard who's looking for revenge after being framed for a royal murder and kidnapping. So far, it's a lot more fun than I anticipated, in part because of its willingness to bite me in the ass for my choices. You can either carefully sneak by enemies or plunge right into the fray, slicing and dicing past. But if you choose the latter, you later have to confront the downed foes undead-style (a zombifying plague is a main plot point) and deal with a "darker" ending. Still, getting through is sometimes a lot faster with a weapon handy. Should I do the moral thing or the strategic thing? Is this even up to me?
I'm not the only one wondering. Scientists have just begun to ask these kinds of questions.
Dishonored isn't the first game to build in a moral choice system, but it's a relatively new idea--only in the past decade or so have designers started using such systems regularly. You make a choice--often a starkly good or bad one, although many designers have started exploring moral gray areas--then deal with the results. This type of game actually shifts what happens in the story based on what you do in it. That can mean a completely different ending, or something more subtle: Maybe earlier you robbed a bar in a town, so when you come back, the residents give you the cold shoulder.
That's in contrast to most games, where you have a singular goal that fits into a single narrative, and you do everything you can to reach that endpoint. This particular type of game, instead, forces us to confront our choices and their consequences. Now the science is starting to catch up to that method of game design, and some early research suggests we play with our real-life morals in these situations. But once we've made that first moral run, we might see the game for what it is--a game--and play it to win.
Andrew Weaver, an assistant professor at Indiana University who focuses on media psychology, tested that idea. He's interested in decision-making in games because they turn "moral agency" over to the players. Unlike a movie, where you might feel bad or disappointed for a character you identify with who makes an immoral decision, the decisions in a game are made by you. When you make an immoral one, you feel guilty.
To learn more about that idea, he sat a series of college students down and had them play the first act of Fallout 3, a post-apocalyptic game from 2008. That's a good choice: In that game, you start as a blank slate, literally from birth, then grow up and start making decisions. What Weaver found was that most people played the game as if it were real life. "People would say things like, 'I felt bad for character x,'" Weaver says. They weren't interested in injuring or stealing. They felt "guilt for a bunch of pixels on a screen." But how to explain the people who did play in a way we'd term "immorally"? (They might've killed a character in the game so they could benefit financially, for example, or made some other move that benefited themselves to the detriment of others.) Did they have a warped sense of the right thing to do? Nope, Weaver says. Those were the pros. They had experience with the game or similar ones, so after presumably making a first run with their moral barriers intact, they could play strategically, making decisions that would produce an objectively stronger character, even if that raised the body count.
It's more complicated than a decision between an evil action and an angelic one, though. Some decisions include multiple factors. For example: An authority figure in the game might ask you to do something you felt was wrong. A player who valued authority over justice--determined by a survey taken before they sat down with the game--might submit to the pressure and follow the order. But that didn't factor in the same way for people with a more typical goal: just beating the game. That doesn't make them bad people, just good players.
You don't have to think hard to refute that idea. What about Grand Theft Auto? What about Call of Duty? Who on Earth doesn't pick up those games just to cause mayhem? But Weaver argues those games don't offer you a real choice. You make "decisions" in lots of games--shoot this guy over here first, or this one?--but moral-choice games force you to take a good, hard look at what those decisions mean. In early games, Weaver says, designers might implement a naughty/nice bar that changed based on what you did, but that's not the same as changing the narrative of the game based on your choices. "It's not a moral decision so much as a technical decision to move the gauge," Weaver says.
So it goes with Grand Theft Auto. Take the violence in GTA (as plenty of media sources already have). The game is objective-driven--the idea is to complete a task, or mission, in an allotted time. The people aren't "real" people that the player has to make a choice about how to deal with; they're obstacles blocking a path to victory. That's if you choose to play that series by just completing the tasks and moving through the linear narrative. Of course, hardly anyone does that; GTA encourages you to go explore. Why isn't that a moral game, then?
Because GTA trades in so-called "sandbox"-style play--you're free to roam around and do whatever you want, but there aren't really consequences to your actions. If you decide to get a sniper rifle, head up to a rooftop, and pick off pedestrians until the army brings in tanks to take you down, you'll eventually die and start over. But you start over pretty much where you left off; it's not as if you now have to deal with the notoriety of being the Sniper of San Andreas for the rest of the game. And that simple fact is largely why there's so much violence in sandbox games like that. The option is there, and since it's devoid of consequences in the game, gamers push the boundaries, do whatever they want. Drive a car into a school! Who cares! It's more curiosity than anything malicious.
The same goes for the narrative in a lot of blockbuster first-person shooters: The characters are in your way, and to get to the finish line, you remove them. There's nothing wrong with that design-wise; it just provokes a different kind of play, where moral choices are largely irrelevant. The interesting thing is that gamers familiar with the structure of a moral-choice game, who have played it for a while and are comfortable with its ins and outs, might begin to treat it as if it were an objective-based game. You case the joint, basically, and can concentrate on the more videogamey goal of simply winning.
What to pull away from this? People who make immoral decisions aren't doing so because they're evil, or want to put on their super-villain costume for a few hours. That's one of the big things to understand, Weaver says. For the most part, we don't inhabit some virtual fantasy land when we play games. We understand it's a game, sure, but our morality is tough to override on the first go-around. Given some time in the virtual world, however, maybe that changes.
As for Dishonored, I was more than content to make my way through violently at first. Why not? But when a loading screen informed me that there were consequences to my actions, I thought twice. I'm still not sure if it's because I felt bad, or was scared of the consequences if the game caught me with my hand in the moral cookie jar. But that's the essence of morality, right? Do we behave morally because we are compelled to as humans, or just because we fear punishment? Finally, games can tap into that element of humanity.