3. Final Project – Prototyping

This blog post explores how I went about learning the workflow behind Environment Design, how I experimented with VR and its features, and the obstacles I faced along the way.

When it came to starting work on the artefact, I began by importing some of the assets I had planned to use and had previously downloaded. This is where the first problem appeared: although I had gathered a large quantity of assets, I had overlooked the fact that they were made for mobile games and, to save resources, did not use two-sided polygons for the foliage on the trees.

This posed a large problem because, while the trees looked fine from above, from below the foliage was invisible. My project was planned to be experienced in VR from a first-person perspective, close to the ground, so the assets I had found were not useful for my intended purpose. I had initially assumed that assets made for mobile would help maximise the frame rate, which is a very important aspect of developing for VR: according to IrisVR (no date), VR requires at least 90 frames per second (with slight variations depending on the hardware) for users to feel comfortable, as lower frame rates can leave them nauseous or disoriented. However, as I later realised, these assets were made to be looked at from a distance in a top-down view, whereas my first-person VR project would place them very close to the player.
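
To catch this kind of problem earlier in the future, an asset's materials can be checked for two-sidedness before it is placed in the scene. Below is a minimal UE4 C++ sketch of such a check; CheckFoliageMaterials is a hypothetical helper of my own, not engine or asset-pack code.

```cpp
// Minimal diagnostic sketch (UE4 C++): warn about foliage meshes whose
// materials are single-sided and would therefore be invisible from below.
// "CheckFoliageMaterials" is a hypothetical helper, not engine or asset-pack code.

#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInterface.h"

void CheckFoliageMaterials(UStaticMeshComponent* FoliageMesh)
{
    if (!FoliageMesh)
    {
        return;
    }

    const int32 NumMaterials = FoliageMesh->GetNumMaterials();
    for (int32 Index = 0; Index < NumMaterials; ++Index)
    {
        const UMaterialInterface* Material = FoliageMesh->GetMaterial(Index);
        if (Material && !Material->IsTwoSided())
        {
            // Single-sided foliage is fine for a top-down mobile camera,
            // but disappears when viewed from underneath in first person.
            UE_LOG(LogTemp, Warning,
                TEXT("%s: material slot %d (%s) is single-sided."),
                *FoliageMesh->GetName(), Index, *Material->GetName());
        }
    }
}
```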

Thus, I had to come up with a new solution and started searching for a new asset pack. While looking for better alternatives, I came across a spline-based asset creation tool for natural environments, which can be seen below (Fig.11).

Fig.11 An example environment constructed using the Procedural Nature Pack
PurePolygons (2015) Procedural Nature Pack Vol. 1 Available at: https://www.unrealengine.com/marketplace/en-US/product/procedural-nature-pack-vol 

These assets were spline-based and used procedural textures, which meant I could reuse them freely without worrying that they would look obviously repeated, something that could have broken the spectator's immersion. Although this set looked much better than the previous one, and its leaves and branches were animated, I wanted more options to choose from.
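
To illustrate what "spline-based" means in practice, below is a rough, generic UE4 C++ sketch of stretching mesh segments along a spline. It is not code from the Procedural Nature Pack; BuildMeshAlongSpline is a hypothetical helper written purely for illustration.

```cpp
// Generic illustration of spline-based placement in UE4 C++:
// stretch a static mesh segment between consecutive spline points.
// "BuildMeshAlongSpline" is a hypothetical helper, not from the asset pack.

#include "Components/SplineComponent.h"
#include "Components/SplineMeshComponent.h"
#include "Engine/StaticMesh.h"

void BuildMeshAlongSpline(AActor* Owner, USplineComponent* Spline, UStaticMesh* SegmentMesh)
{
    if (!Owner || !Spline || !SegmentMesh)
    {
        return;
    }

    const int32 NumPoints = Spline->GetNumberOfSplinePoints();
    for (int32 Index = 0; Index < NumPoints - 1; ++Index)
    {
        USplineMeshComponent* Segment = NewObject<USplineMeshComponent>(Owner);
        Segment->SetMobility(EComponentMobility::Movable);
        Segment->SetStaticMesh(SegmentMesh);

        // Deform the segment so it follows the curve between two spline points.
        const FVector StartPos = Spline->GetLocationAtSplinePoint(Index, ESplineCoordinateSpace::Local);
        const FVector StartTangent = Spline->GetTangentAtSplinePoint(Index, ESplineCoordinateSpace::Local);
        const FVector EndPos = Spline->GetLocationAtSplinePoint(Index + 1, ESplineCoordinateSpace::Local);
        const FVector EndTangent = Spline->GetTangentAtSplinePoint(Index + 1, ESplineCoordinateSpace::Local);
        Segment->SetStartAndEnd(StartPos, StartTangent, EndPos, EndTangent);

        Segment->AttachToComponent(Spline, FAttachmentTransformRules::KeepRelativeTransform);
        Segment->RegisterComponent();
    }
}
```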

After looking through the Epic Games Marketplace for more assets, Quixel Bridge came to my attention. With its suite of photoscanned, game-optimized, free-to-use Megascans, it was perfect for the project. Photoscanned assets are ideal for this project because they look very close to their real-life counterparts and are already used in AAA games such as Metro Exodus (Fig.12) (Umar-Khitab, 2019). This should increase the chances of the player experiencing sensory immersion, which may in turn create a more intense emotional payoff at the awe-inspiring moment.

Fig.12 Level from Metro Exodus – Taiga – using Quixel Megascans Assets
4A Games (2019) Metro Exodus [Online/Disk] PC/XBoxOne/PS4. Deep Silver

Sjoerd De Jong’s video on volumetric fog and light shafts (Unreal Engine, 2018) has helped me better understand how these work within UE4. Light shafts, often called “God Rays”, are widely considered beautiful, which may be partly because light is generally a source of wonder, but it may also be tied to people associating them with divinity, as their nickname suggests. As Allen’s research (2018) suggests, awe is often associated with religion and is considered by many to be a religious, “paradigm-shifting” experience. According to Hildebrand (1999), light plays a very important role in drawing one’s attention toward prospect. Thus, using light shafts with consideration may significantly improve the emotional reaction of the player. This will be explored further in future blog posts.
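
For reference, the sketch below shows roughly how the two effects could be enabled from UE4 C++ on existing fog and directional light components. In practice they are normally toggled in the editor's Details panel, as in De Jong's video; the helper name and the numeric values here are illustrative assumptions of mine.

```cpp
// Rough sketch (UE4 C++): enable volumetric fog and light shaft bloom
// ("god rays") on already-placed components. "SetupGodRays" and the
// numeric values are illustrative assumptions.

#include "Components/DirectionalLightComponent.h"
#include "Components/ExponentialHeightFogComponent.h"

void SetupGodRays(UDirectionalLightComponent* Sun, UExponentialHeightFogComponent* Fog)
{
    if (!Sun || !Fog)
    {
        return;
    }

    // Volumetric fog provides the participating medium that light can scatter in.
    Fog->SetVolumetricFog(true);
    Fog->SetVolumetricFogScatteringDistribution(0.7f); // anisotropy: how strongly light scatters forward (assumed value)

    // Light shaft bloom produces the visible rays around the sun disc.
    Sun->bEnableLightShaftBloom = true;
    Sun->BloomScale = 0.3f;             // keep the effect subtle (assumed value)
    Sun->bEnableLightShaftOcclusion = true;
    Sun->MarkRenderStateDirty();        // apply the directly-edited properties
}
```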

Another critical part of my research has been Virtual Reality: specifically, how to design a game or interactive experience for VR, as well as the more technical aspects of using the Oculus Rift Development Kit 2 in Unreal Engine 4, such as camera placement and specific in-engine settings.

As Harmony Studios (no date) mention in their blog post on the “Golden Rules” of VR design, points of interest need to be identified early on: the screen is no longer a 2D rectangle with specific bounds, and the designer needs to build the scene with that in mind. The peripheral areas (Fig.13) need particular attention, as they are one of the more important features separating VR experiences from desktop display experiences, and they can be used to guide players towards their next goal or point of interest. The “No Go Zone”, as the article calls it, is the area where the designer should avoid placing important elements of the experience, unless it is critical that the player makes a sudden 180-degree turn for a more dynamic moment, as players may otherwise struggle to remain immersed in the environment. A small angle-check sketch after Fig.13 illustrates how these zones can be reasoned about.

Fig.13 A visual representation of a human “view cone” from a VR importance perspective
Harmony Studios (no date) The Golden Rules for VR Design, Available at: https://www.harmony.co.uk/golden-rules-vr-design/ 
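
To make the “view cone” idea more concrete, the sketch below classifies a point of interest by its horizontal angle from the player's forward direction. The zone thresholds are my own illustrative assumptions rather than the article's exact figures, and GetViewZone is a hypothetical helper.

```cpp
// Illustrative sketch (UE4 C++): classify where a point of interest sits
// relative to the player's forward direction. The angle thresholds are
// assumptions for illustration, not the article's exact figures.

#include "CoreMinimal.h"

enum class EViewZone : uint8
{
    Comfortable,   // roughly straight ahead
    Peripheral,    // visible by turning the head
    NoGo           // behind the player; requires a large body turn
};

EViewZone GetViewZone(const FVector& CameraLocation, const FVector& CameraForward, const FVector& PointOfInterest)
{
    // Work in the horizontal plane only; vertical offset is ignored here.
    FVector ToPoint = PointOfInterest - CameraLocation;
    ToPoint.Z = 0.f;
    FVector Forward = CameraForward;
    Forward.Z = 0.f;

    const float CosAngle = FVector::DotProduct(Forward.GetSafeNormal(), ToPoint.GetSafeNormal());
    const float AngleDegrees = FMath::RadiansToDegrees(FMath::Acos(FMath::Clamp(CosAngle, -1.f, 1.f)));

    if (AngleDegrees <= 45.f)   // assumed comfortable zone
    {
        return EViewZone::Comfortable;
    }
    if (AngleDegrees <= 100.f)  // assumed peripheral zone
    {
        return EViewZone::Peripheral;
    }
    return EViewZone::NoGo;     // anything further behind the player
}
```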

Reading up on the technical aspects of VR has also proven very useful, as the Unreal Engine 4 documentation gives a multitude of helpful tips on how best to set up the camera depending on the player’s intended position, which options to use in order to minimise the chances of motion sickness, and how to make the best use of the available equipment, which in this case was an Oculus Rift Development Kit 2.

For example, in order to avoid simulation sickness, there is a target frame rate for each device (Fig.14); in the case of this project, 75 FPS was required. To help reach these frame rates, the documentation lists recommended settings and general design tips, such as removing walking head bob, avoiding any type of camera shake, and avoiding strong, vibrant colors and lighting, especially when combined with motion blur or depth of field. Additionally, it recommends avoiding stairs and movement faster than a human’s regular walking speed, as both can disorient the player; lifts should be used where possible, and movement speed should be kept steady. A rough sketch of applying some of these camera-side recommendations follows Fig.14.

Fig.14 Table of Target Frame Rates depending on the Oculus Rift Device
Unreal Engine (2017) Oculus Rift Best Practices, Available at: https://docs.unrealengine.com/en-US/Platforms/VR/OculusVR/OculusRift/BestPractices/index.html 
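
Below is a rough UE4 C++ sketch of how a couple of these recommendations might be applied to a first-person pawn's camera. ConfigureVRCamera is a hypothetical helper, and the choice of which effects to disable follows the guidance above rather than any code from the documentation.

```cpp
// Sketch (UE4 C++) of applying some of the comfort recommendations to a
// first-person pawn's camera. "ConfigureVRCamera" is a hypothetical helper;
// the set of disabled effects follows the guidance above, not engine defaults.

#include "Camera/CameraComponent.h"

void ConfigureVRCamera(UCameraComponent* Camera)
{
    if (!Camera)
    {
        return;
    }

    // Let the HMD drive the camera's position and rotation directly,
    // rather than layering head bob or camera shakes on top of it.
    Camera->bLockToHmd = true;

    // Disable motion blur, which the guidance advises against in VR.
    Camera->PostProcessSettings.bOverride_MotionBlurAmount = true;
    Camera->PostProcessSettings.MotionBlurAmount = 0.f;

    // Tone down lens flares as one example of strong, vibrant effects.
    Camera->PostProcessSettings.bOverride_LensFlareIntensity = true;
    Camera->PostProcessSettings.LensFlareIntensity = 0.f;
}
```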

The camera settings are quite different from a regular UE4 setup, as they vary depending on the player’s position. As seen in Fig.15, if the player is seated, the camera’s original position needs to be artificially raised so that the system’s calculations still place the seated player’s view at a normal eye height.

Fig.15 Camera setup for Seated position
Unreal Engine (2017) Virtual Reality Best Practices, Available at: docs.unrealengine.com/en-US/Platforms/VR/DevelopVR/ContentSetup/index.html 

If the player is standing, the initial position of the camera instead needs to be exactly at ground level for the system’s calculations to work properly, as seen in Fig.16. A combined sketch of both setups is shown after Fig.16.

Fig.16 Camera Setup for Standing position
 Unreal Engine (2017) Virtual Reality Best Practices, Available at: docs.unrealengine.com/en-US/Platforms/VR/DevelopVR/ContentSetup/index.html 
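
As a rough UE4 C++ sketch of the two setups described above: SetTrackingOrigin is the engine call for choosing between eye-level and floor-level tracking, while the helper functions and the 120 cm eye-height offset are illustrative assumptions of mine.

```cpp
// Sketch (UE4 C++) of the two camera setups described above. The helper
// functions and the 120 cm eye-height offset are illustrative assumptions;
// SetTrackingOrigin selects eye-level vs floor-level tracking.
// Requires the "HeadMountedDisplay" module to be listed in Build.cs.

#include "Components/SceneComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"

// Seated: track relative to the headset's eye position and raise the camera
// root so a seated player still sees the world from a normal eye height.
void SetupSeatedCamera(USceneComponent* CameraRoot)
{
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Eye);
    CameraRoot->SetRelativeLocation(FVector(0.f, 0.f, 120.f)); // assumed eye-height offset in cm
}

// Standing: track relative to the floor and keep the camera root at ground
// level, letting the player's real height place the view.
void SetupStandingCamera(USceneComponent* CameraRoot)
{
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
    CameraRoot->SetRelativeLocation(FVector::ZeroVector);
}
```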

After conducting all of this research, one major issue arose: accessing the necessary VR equipment and playtesters became impossible due to unprecedented circumstances. From a VR standpoint, the project therefore had to take a different turn and was transformed into a first-person experience on a desktop display. According to Pallavacini and Pepe (2019), playing a game in Virtual Reality elicits more intense positive emotions, and the sense of immersion and flow is greater than when the same content is experienced on a regular desktop display; however, the same study showed that the fulfillment of psychological needs is independent of the display method, so the change does not pose a critical risk to the project at hand. While the VR-specific research is no longer directly applicable to the current project, the information remains valuable.

Reference List:
