tools & methods
PROTOTYPING
UNITY
C#
GRAVITY SKETCH
TWITCH
OCULUS
collaboration with
the team
Sina Grebrodt
Amanda Wallgren
supervisor
Jenny Rodenhouse
background
This project was born out of four workshop sessions where we experimented with VR, Unity, and photogrammetry. Afterward, my teammate and I felt inspired and curious to explore how VR as a medium can enable solitude versus social experience. We were also intrigued by what we defined as an “iceberg phenomenon”: what is present in a VR scene versus what the person in the headset can actually see?
outcome
OASIS consists of a scene in which the VR participant is the observer. People participating through the live-streaming platform Twitch become the active players, enabling the scene by inhabiting different game objects in the sky. By "misusing" Twitch's chat field, the players steer the game objects over different trigger zones and watch the scene unfold. Thus, the more people participating actively, the more crowded the scene becomes. Consequently, the scene diminishes, and ceases to exist, without active input from people on Twitch. The VR participant experiences the scene with no decision capacity, becoming the observer, which effectively reverses familiar gaming roles.
my role
Throughout this project, my partner Sina and I worked closely together on every step. We created the concept together and took turns coding, testing, as well as building the scene in Unity.

Introduction to OASIS


Twitch plays OASIS

Switching the roles of the classic gamer vs. the participant


VR participant's view

Twitch participants' view

The project misuses Twitch’s chat feature to let viewers interact with the ecosystem and the VR inhabitant.

Finding Trigger Zones

While steering the sky objects around, the Twitch participants encounter different trigger zones that make the scene unfold. The more people participate, the more populated the scene becomes!

Moving far outside the playing area will cause a drought, and a sandstorm will appear...

When plants grow tall enough, the players will see birds appear...

After spending a long time in the scene, the area will start to flood...

This project makes solitary VR experiences more social and unpredictable!


exploration

Point of interest 1: How does VR as a medium enable solitude versus the possibility of social experience?
Point of interest 2: “Iceberg” phenomenon: what is present in the scene versus what the person in the VR headset can see? Does the scene exist if you do not see it?
Point of interest 3: How does our knowledge of the fragility of the "real-real" world objects translate to how game objects exist in VR?

Initial prototyping in Unity to get started

The first goal was to make the cubes grow while another cube passed over their trigger zones.

The next challenge was to make the cubes shrink when the steering cube, which we had now swapped for a cloud, exited the trigger zone.
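The grow-on-enter / shrink-on-exit behavior can be sketched with Unity's trigger callbacks. This is a minimal illustration under my own assumptions, not the project's actual code: the steerable object is tagged "Cloud" (a hypothetical tag), the zone has a collider with "Is Trigger" enabled, and growth is a simple scale change per frame.

```csharp
using UnityEngine;

// Attach to a trigger-zone GameObject (collider with "Is Trigger" checked).
// While the steerable object tagged "Cloud" is inside the zone, the plant
// grows; once it leaves, the plant shrinks back toward its original size.
public class GrowOnTrigger : MonoBehaviour
{
    public Transform plant;          // the object that grows and shrinks
    public float growSpeed = 0.5f;   // scale units per second
    public float maxScale = 3f;

    private bool cloudInside;
    private Vector3 baseScale;

    void Start() => baseScale = plant.localScale;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Cloud")) cloudInside = true;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Cloud")) cloudInside = false;
    }

    void Update()
    {
        // Grow while the cloud is inside, shrink while it is not,
        // clamped between the original size and a maximum.
        float dir = cloudInside ? 1f : -1f;
        float s = plant.localScale.x + dir * growSpeed * Time.deltaTime;
        plant.localScale = Vector3.one * Mathf.Clamp(s, baseScale.x, maxScale);
    }
}
```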

Once we had the interactions working in Unity, we connected the scene to Twitch, a live-streaming site commonly used by gamers.

In Unity, each cube was assigned a name and commands for left, right, forward, and backward. This enabled us to steer the cubes individually through the Twitch chat.
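Twitch chat is reachable over plain IRC, so a Unity script can read commands straight from the stream. The sketch below is an assumption about how such a bridge might look, not the project's implementation: OAUTH_TOKEN, NICK, and CHANNEL are placeholders, and the blocking ReadLine is simplified (a robust version would read on a background thread).

```csharp
using System.IO;
using System.Net.Sockets;
using UnityEngine;

// Minimal Twitch-chat bridge: connects to Twitch's IRC gateway and turns
// chat messages like "left" / "right" into moves of a steerable object.
public class TwitchSteering : MonoBehaviour
{
    public Transform cube;    // the steerable game object
    public float step = 0.5f; // distance moved per chat command

    private TcpClient client;
    private StreamReader reader;
    private StreamWriter writer;

    void Start()
    {
        client = new TcpClient("irc.chat.twitch.tv", 6667);
        reader = new StreamReader(client.GetStream());
        writer = new StreamWriter(client.GetStream()) { AutoFlush = true };
        writer.WriteLine("PASS oauth:OAUTH_TOKEN"); // placeholder token
        writer.WriteLine("NICK NICK");              // placeholder bot name
        writer.WriteLine("JOIN #CHANNEL");          // placeholder channel
    }

    void Update()
    {
        while (client.Available > 0)
        {
            string line = reader.ReadLine();
            if (line == null) return;
            if (line.StartsWith("PING"))            // keep the connection alive
                writer.WriteLine("PONG :tmi.twitch.tv");
            int i = line.IndexOf("PRIVMSG");
            if (i < 0) continue;                    // not a chat message
            string msg = line.Substring(line.IndexOf(':', i) + 1).Trim().ToLower();
            switch (msg)
            {
                case "left":     cube.position += Vector3.left * step;    break;
                case "right":    cube.position += Vector3.right * step;   break;
                case "forward":  cube.position += Vector3.forward * step; break;
                case "backward": cube.position += Vector3.back * step;    break;
            }
        }
    }
}
```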

Creating Game Objects using Gravity Sketch

Seeing the cubes grow and shrink we envisioned an ecosystem of fantastical-looking plants growing in an otherwise unpopulated desert. This became the inspiration for the aesthetics of the project.

Creating & modifying the scene in Unity

We later placed the models from Gravity Sketch into our Unity scene, replacing the cubes. We tried a few different expressions, playing with colors and light, until we found a look that we liked.

Steering the Game Objects using Twitch

The Twitch community has a strong culture of using emojis (or emotes, as they are called on Twitch). Therefore, we let the most popular emotes replace the words (left, right, back, forward) used to steer the game objects.
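Because emotes arrive in chat as plain text, swapping words for emotes amounts to a dictionary lookup. The emote names below are well-known Twitch emotes chosen for illustration; which emotes the project actually mapped to which directions is my assumption.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Example mapping only: these are popular Twitch emotes, not necessarily
// the ones used in OASIS. A chat message matching a key moves the object
// in the associated direction, replacing the earlier word-based commands.
static class EmoteCommands
{
    public static readonly Dictionary<string, Vector3> Directions =
        new Dictionary<string, Vector3>
    {
        { "Kappa",      Vector3.left },
        { "PogChamp",   Vector3.right },
        { "LUL",        Vector3.forward },
        { "BibleThump", Vector3.back },
    };
}
```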

Testing the concept with others for the first time

When we finally had the project up and running, we wanted to test it. Three people participated through the live stream on Twitch, and one person was directly connected to the Unity scene through a VR headset. We wanted to see how easily the Twitch participants could steer the objects, and what the experience was like for the person in VR.

Retrospective & Learnings

Working with virtual reality was completely new for both my project partner and me. We realized that working in real, 1:1 space, rather than creating an environment accessed through a screen (which I was used to from my industrial design background), required a new way of thinking and a new workflow. We developed a process where one of us would implement changes in the environment while the other was in VR, observing the changes and giving instructions. Doing a project like this alone would be much harder and slower, since jumping in and out of VR is time-consuming and there is real value in seeing changes made in real time. Appreciation to my project partner Sina<3

The goal was to make an actual working prototype rather than just a concept, and to finally be able to test it on people. Technically, the Twitch channel could have been set up for people to use from home; however, time ran out and we set it up in a lab environment where all participants sat close to each other. I believe the physical distance could have made a difference while trying the concept out. I also find it interesting to think about how OASIS might be used over time. I don't see the project as something that would constantly be interacted with, but as a room that people could enter and leave as they wish. Sometimes it might be completely empty when someone enters the scene, and sometimes very populated. It would be interesting to see how this would change the experience.

Finally, this project was the first time either of us had coded anything. This meant that most of the project was spent trying, failing, and trying again. In the end, this was really valuable and we learned a lot from the approach. It would be fun to do this project again and see how I would experience, or go about, this part now that I have a better understanding of coding.