I've spent the week enjoying Dishonored, a Victorian-steampunk fever-dream of a game that came out earlier this month. In it you play Corvo, a bodyguard who's looking for revenge after being framed for a royal murder and kidnapping. So far, it's a lot more fun than I anticipated, in part because of its willingness to bite me in the ass for my choices. You can either carefully sneak by enemies or plunge right into the fray, slicing and dicing your way past. But if you choose the latter, you later have to confront the downed foes as undead (a zombifying plague is a main plot point) and deal with a "darker" ending. Still, getting through is sometimes a lot faster with a weapon handy. Should I do the moral thing or the strategic thing? Is this even up to me?
I'm not the only one wondering. Scientists have just begun to ask these kinds of questions.
Dishonored isn't the first game to build in a moral choice system, but the idea is relatively new--only in the past decade or so have designers started using such systems regularly. You make a choice--often a starkly good or bad one, although many games have started exploring moral gray areas--then deal with the results. This type of game actually shifts what happens in the story based on what you do. That can mean a completely different ending, or something more subtle: maybe you robbed a bar in a town earlier, so when you come back, the residents give you the cold shoulder.
That's in contrast to most games, where you have a singular goal that fits into a single narrative, and you do everything you can to reach that endpoint. Moral-choice games instead force us to confront our choices and their consequences. Now the science is starting to catch up with that method of game design, and some early research suggests we play with our real-life morals in these situations--at least the first time through. After that, we might look at it and truly see a game, then play it to win.
Andrew Weaver, an assistant professor at Indiana University who focuses on media psychology, tested that idea. He's interested in decision-making in games because games turn "moral agency" over to the players. Unlike in a movie, where you might feel bad or disappointed when a character you identify with makes an immoral decision, in a game the decisions are yours. When you make them, you feel guilty.
They felt "guilt for a bunch of pixels on a screen."To learn more about that idea, he sat a series of college students down and had them play the first act of Fallout 3, a post-apocalyptic game from 2008. That's a good choice: In that game, you start as a blank slate, literally from birth, then grow up and start making decisions. What Weaver found was that most people played the game as if it was real life. "People would say things like, 'I felt bad for character x," Weaver says. They weren't interested in injuring or stealing. They felt "guilt for a bunch of pixels on a screen." But how to explain the people who did play in a way we'd term "immorally"? (They might've killed a character in the game so they could benefit financially, for example, or made some other move that benefited themselves to the detriment of others.) Did they have a warped sense of the right thing to do? Nope, Weaver says. Those were the pros. They had experience with the game or similar games, so after already, presumably, making a run with the moral barrier intact, they could play strategically, making decisions that would make for a character objectively stronger, even if that raised the body count.
It's more complicated than a decision between an evil action and an angelic one, though. Some decisions involve multiple factors. For example: an authority figure in the game might ask you to do something you felt was wrong. A player who valued authority over justice--determined by a survey taken before they sat down with the game--might submit to the pressure and follow the order. But that didn't factor in the same way for people with a more typical goal: just beating the game. That doesn't make them bad people, just good players.
You don't have to think hard to refute that idea. What about Grand Theft Auto? What about Call of Duty? Who on Earth doesn't pick up those games just to cause mayhem? But Weaver argues those games don't offer you a real choice. You make "decisions" in lots of games--shoot this guy over here first, or this one?--but moral-choice games force you to take a good, hard look at what those decisions mean. In early games, Weaver says, designers might implement a naughty/nice bar that changed based on what you did, but that's not the same as changing the narrative of the game based on your choices. "It's not a moral decision so much as a technical decision to move the gauge," Weaver says.
So it goes with Grand Theft Auto. Take the violence in GTA (as plenty of media sources already have). The game is objective-driven--the idea is to complete a task, or mission, in an allotted time. The people aren't "real" people that the player has to make a choice about how to deal with; they're obstacles blocking a path to victory. That's if you choose to play that series by just completing the tasks and moving through the linear narrative. Of course, hardly anyone does that; GTA encourages you to go explore. Why isn't that a moral game, then?
Because GTA trades in so-called "sandbox"-style play--you're free to roam around and do whatever you want, but there aren't really consequences to your actions. If you decide to get a sniper rifle, head up to a rooftop, and pick off pedestrians until the army brings in tanks to take you down, you'll eventually die and start over. But you start over pretty much where you left off; it's not as if you now have to deal with the notoriety of being the Sniper of San Andreas for the rest of the game. And that simple fact is largely why there's so much violence in sandbox games like that. The option is there, and since it's devoid of consequences in the game, gamers push the boundaries, do whatever they want. Drive a car into a school! Who cares! It's more curiosity than anything malicious.
The same goes for the narrative in a lot of blockbuster first-person shooters: the characters are in your way, and to get to the finish line, you remove them. There's nothing wrong with that design-wise; it just provokes a different kind of play, where moral choices are largely irrelevant. The interesting thing is that gamers familiar with the structure of a moral-choice game, who have played it for a while and are comfortable with its ins and outs, might begin to treat it as if it were an objective-based game. You case the joint, basically, and can concentrate on the more videogamey goal of simply winning.
What to take away from this? People who make immoral decisions aren't doing so because they're evil, or because they want to put on their super-villain costume for a few hours. That's one of the big things to understand, Weaver says. For the most part, we don't inhabit some virtual fantasy land when we play games. We understand it's a game, sure, but our morality is tough to override the first go-around. Given some time in the virtual world, however, maybe that changes.
As for Dishonored, I was more than content to make my way through violently at first. Why not? But when a loading screen informed me that there were consequences to my actions, I thought twice. I'm still not sure if it's because I felt bad, or was scared of the consequences if the game caught me with my hand in the moral cookie jar. But that's the essence of morality, right? Do we behave morally because we are compelled to as humans, or just because we fear punishment? Finally, games can tap into that element of humanity.
In Civilization 5 it took me some time to stop trying to build the scientific liberty loving society I want and start building the fascist theocratic militaristic state that fits my play style.
If any of the current games are tests of moral ground, then the upcoming "DayZ" will be the one true test. It's a zombie survival sandbox where no teams or real game-wide communication are available. The only way to gauge whether someone won't shoot you is to actually trust they won't shoot you first. The problem is that other survivors actually benefit from killing you for your gear, but they lose out on having someone else with them to help them out. Chances are, especially if you already have others working with you, that someone is probably going to get shot. I think POPSCI would benefit from having someone try out the game when it comes out; it's a real trip into the darkest parts of your mind.
And a bit of a shame to not even mention Lionhead Studios and Peter Molyneux, pretty much the father of choice-driven content in gaming. Unfamiliarity doesn't make something new.
In Fallout 3, I put a grenade in the pocket of a child and was disappointed it wouldn't go off like with all the other NPCs. I wonder what the professors would have thought about that.
They make the mistake of thinking everyone is playing to "beat the game."
You can't just assume you know the players' goals. For instance, I often play a game just to explore the possibilities and admire the environment. Many people play games just to mess with other players. Haven't these researchers ever played board games?
(1) Many games reward behavior--meaning that players will often play to a type. Thus, one playthrough might be as the "good guy" and another as the "bad guy" to see all that the game offers. Consider FO:NV, as players will often choose to play through twice: once per major faction. That reflects curiosity and role playing more than personal morality.
(2) Game morality tends to be highly subjective. Is killing bad guys good or evil? Is killing a zombie good, evil (selfish), or amoral? Is draining the little emo girls in Bioshock merciful or cruel? Real-life morality and game morality are often too different to relate.
(3) All you have proven is that the morality of games is not all that important to people who realize that they are just games. Congratulations. Where were you during the anti-D&D days back in the '80s and early '90s?
@Cordeos I had the same exact problem.
I completely disagree. This just proves that you and the scientist don't know the point of ROLE PLAYING, as in, taking on a role. You see, there is no "pro" at a role-playing game, unless you say that the people who take on different roles like different clothing are the pros. Feeling for characters in a game is the first step to taking on that perspective; anyone who plays D&D could tell you this. The moral decisions are mostly there because they enhance one's ability to depict how your character acts toward the outside world.
"Want to Dominate The Game? Set Your Morals Aside"...Just like real life, then...