Who’s The Killer

A VR Experience For Naive Guests

The first thing players see after the starting cutscene. This is where the guess of the murderer is submitted.

Pieces of a wedding photo. Players can find several of these around the room and assemble them in the frame.

A screenshot of the game where the UV flashlight is shining on a few of the clues

Our nine suspects and their traits: gender, hair color, glasses, and birthmark. Only three of the four traits were needed to uniquely identify the killer.

— PROJECT NAME

Who's the Killer


— ROLE

Programming, Producer


— DATE

September-October, 2024

For the second two-week round of the Building Virtual Worlds class at CMU, we were tasked with creating a VR experience for naive guests. This meant that during the final presentation, we could help the guest into the headset but could tell them nothing about the experience, their goals, or the controls. Furthermore, we weren't allowed to have in-game tutorials such as "Press A to Interact." This was an exercise in indirect control: how do we guide players to both know what to do and make the decisions we want them to make? To test our ability to do this, we had to write a set of predictions about our guest's in-game actions.

We chose to create a murder mystery escape room. Guests play the role of a detective who has located the murderer's apartment and is trying to determine the identity of the killer before they kill their next victim.


At the start of the experience, we faced the player towards a tablet showing the faces of the suspects. The clues we hid around the room helped the player determine:

- the killer's gender

- the killer's hair color

- whether the killer wore glasses

- whether the killer had a birthmark

There were also other clues that hinted at the motive, though discovering them wasn't necessary to complete the experience.



Indirect Control


We used several methods of indirect control to guide our player. We started them off facing the tablet where their final decision would be submitted, which helped them orient themselves within the room. One of the clues was an answering machine with a flashing red light and a large red button, which helped the player realize it was important. On a letter found in a safe, we underlined the important information. After the player picked up the UV flashlight, footprints became dimly visible on the ground, prompting them to turn off the lights to see the prints more clearly.



Managing Difficulty


We did a few additional things to help keep the difficulty low. If this were a game that someone would try multiple times, high difficulty would be okay; however, we had one shot to demo this game to our class, with one unknown naive guest. Because of this, we did our best to make sure they'd have all the information they needed while still feeling like they were making discoveries and figuring things out. After all, since this is a game where the guest plays as a detective, if we don't make them feel like a detective we've failed in some way.


First, we repeated information across several clues in case the player missed one. For example, to figure out the gender of the killer, the player could either listen to the message on the answering machine or read the letter in the safe; both revealed the killer's name and gender.


Second, while there were four distinguishing traits among our suspects, only three were needed to uniquely identify the killer. This meant that even if a player failed to figure something out or missed a connection we wanted them to make, they could still win.
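This redundancy is easy to check mechanically: any three traits suffice exactly when every other suspect differs from the killer in at least two traits, so dropping any single trait still leaves a difference. A sketch with hypothetical suspect data (the game was built in Unity C#; these are not the actual characters' traits):

```python
from itertools import combinations

# Hypothetical suspect roster. Each tuple: (gender, hair color, glasses, birthmark).
suspects = {
    "A": ("M", "black", False, False),
    "B": ("M", "blond", True,  False),
    "C": ("F", "black", False, True),
    "D": ("M", "red",   False, True),
    "E": ("F", "red",   True,  True),   # the killer
    "F": ("F", "blond", True,  False),
    "G": ("M", "red",   True,  False),
    "H": ("F", "black", True,  False),
    "I": ("M", "blond", False, True),
}
killer = "E"

def identifiable(trait_indices):
    """True if the killer's values on these traits match no other suspect."""
    key = tuple(suspects[killer][i] for i in trait_indices)
    matches = [name for name, traits in suspects.items()
               if tuple(traits[i] for i in trait_indices) == key]
    return matches == [killer]

# Every 3-trait subset still pins down the killer uniquely.
assert all(identifiable(combo) for combo in combinations(range(4), 3))
```

A check like this is handy when authoring the roster: it catches a suspect who accidentally differs from the killer by only one trait, which would make that trait mandatory.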


Third, certain objects could be picked up while others couldn't, which helped give players a sense of what was important to the mystery.


Finally, we made a safe that contained one clue and some story information. The code to the safe was three digits long, and there were two ways to find it. The player could either piece together a photo and shine the UV flashlight on it, revealing a date that could be read as a three-digit code, or shine the flashlight on the safe itself, revealing fingerprints on the code buttons. The three highlighted digits were 2, 6, and 7. We didn't want anyone to have to try all six possible orderings of these digits, since that might lead to frustration and giving up. Thus, we made three of the six orderings open the safe.
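The acceptance logic amounts to a keypad that remembers the last three presses and compares them against a whitelist. A minimal Python sketch (the real game was Unity C#, and which three orderings we accepted is illustrative here, not recorded):

```python
class SafeKeypad:
    """Keypad that opens when the last three presses form an accepted code."""

    def __init__(self, accepted_codes):
        self.accepted = set(accepted_codes)
        self.buffer = ""      # rolling window of recent presses
        self.open = False

    def press(self, digit: str) -> bool:
        # Keep only the last three presses, so a wrong start needs no reset.
        self.buffer = (self.buffer + digit)[-3:]
        if self.buffer in self.accepted:
            self.open = True
        return self.open

# Hypothetical choice of the three accepted orderings of 2, 6, 7.
ACCEPTED = {"267", "627", "672"}
```

The rolling buffer matters for naive guests: they can mash buttons freely, and any accepted ordering appearing in their last three presses opens the safe.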


One interesting discussion in our group revolved around the UV flashlight. It makes sense for the room lights to be off so the UV flashlight is more impactful and visible, but the lights being off also makes the other clues harder to see. One of my teammates thought the lights should turn off automatically a few minutes in, both to add interest and to let the flashlight shine. I thought this took too much control away from the player: someone who solved things quickly would just have to wait for the lights to go off, while someone slower would be blocked from seeing the other clues. So I advocated for keeping the lights on the whole time. We playtested both, and neither worked well. We then came up with the idea of a light switch, which satisfied both the interest and the control concerns.


Building for VR


I had never built anything for VR before, so this was an interesting experience. We used a Meta Quest 3 and the Meta SDK for Unity. The SDK had good demos and documentation, so I found it pretty easy to pick up. There were differences between running our game directly on the headset and running it in Unity wired to the headset, but playtesting helped us identify and resolve them.


Because we were building for naive guests whom we couldn't explain the controls to, my group chose hand tracking over controllers. We tried to make every object intuitive to interact with, whether by picking it up, poking it, or grabbing and opening it. One downside of this approach is the limited quality of the Quest 3's hand tracking. In particular, tracking is lost when someone moves their hands out of sight of the headset's cameras. This happened most often when people picked up our flashlight and then tried to look around the room, or tried to flip the light switch without looking at it. Given our time constraints, we chose to accept this. We made objects unaffected by gravity so they would be easier to move around, especially if hand tracking was lost, and this worked well. Through playtesting we found that hand tracking caused few issues.


One interesting difference between VR and the real world is that nothing in VR is truly tangible. If we don't prevent it, players can move their head to look through any object or reach through closed doors. If we do prevent it by stopping the camera or hand models, the resulting physical/virtual misalignment can cause nausea or break immersion. If we don't, players can bypass puzzles by reaching through closed cabinets or peering into our safe. They can also get frustrated when they accidentally grab objects inside a container while trying to open it. My solution was to disable and hide all important objects inside containers and the safe until they were opened.
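In practice this amounts to deactivating a container's contents until its open event fires (in Unity, the equivalent of toggling GameObject.SetActive). A minimal Python sketch of the idea:

```python
class Item:
    """A grabbable object; inactive items are invisible and untouchable."""

    def __init__(self, name: str):
        self.name = name
        self.active = True


class Container:
    """Hides its contents until opened, so players can't grab through walls."""

    def __init__(self, contents):
        self.contents = list(contents)
        self.is_open = False
        for item in self.contents:
            item.active = False   # hidden while the container is closed

    def open(self):
        self.is_open = True
        for item in self.contents:
            item.active = True    # revealed only now
```

This also fixes the grab-through frustration: a hand reaching for the cabinet door can't accidentally pick up the letter inside, because the letter doesn't exist to the physics system yet.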


An unexpected challenge I faced was implementing the light switch. It was easy to write a function that turned the room lights on and off. The hard part was making something that felt intuitive in VR. Initially, I added collision points to each index finger and to the top and bottom of the switch, letting players push the switch into position. After watching people try this, though, I saw that some tried to flip the switch with bent fingers, while others tried to pinch it. I tried several ways of making the switch usable while still feeling real, but in the end decided to simplify. I realized the most important thing was that the switch work on the first try: if a player flipped the switch and nothing happened, they might assume it was background decoration and never touch it again. The final version of the switch flips automatically when the hand gets close enough. It's not realistic and isn't as satisfying to use, but it works far more reliably.
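The final behavior is essentially a proximity trigger with a re-arm zone, so a hand lingering near the switch doesn't toggle the lights repeatedly. A Python sketch of that logic (the real implementation was Unity C#; the radii here are illustrative, not the shipped values):

```python
import math

TRIGGER_RADIUS = 0.05   # metres: hand this close flips the switch
RELEASE_RADIUS = 0.10   # hand must withdraw this far before it can flip again


class ProximitySwitch:
    """Flips once per approach; re-arms only after the hand withdraws."""

    def __init__(self, position):
        self.position = position
        self.on = False
        self.armed = True

    def update(self, hand_position) -> bool:
        distance = math.dist(self.position, hand_position)
        if self.armed and distance < TRIGGER_RADIUS:
            self.on = not self.on   # toggle the room lights
            self.armed = False      # ignore the hand until it leaves
        elif distance > RELEASE_RADIUS:
            self.armed = True
        return self.on
```

The gap between the two radii is deliberate: with a single threshold, tracking jitter right at the boundary would flicker the lights on and off.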


Playtesting


To make sure our difficulty level was right, our story was understandable, and our controls were intuitive, we had to playtest. Part of my job as producer was to set up playtesting, take notes, and interview the playtesters. Across four playtesting sessions, we had 12 playtesters try the game with the same setup as our final presentation, plus another five or six during our interim presentation, and we iterated on feedback between sessions. Overall, I'm very happy with the way we playtested. It gave us a lot of confidence in our predictions, which was validated when the naive guest at our final presentation acted almost exactly as we expected.