Scale Worlds


Unity 3D







Designer + Unity Developer + Manager


Create an immersive virtual environment where students can see various scientific entities in relation to their own bodies and conduct realistic size comparisons that cannot be replicated in everyday experience.


Dr. Matthew Peterson — Design Professor

Dr. Karen Chen — Human Factors Engineering Professor

Dr. Cesar Delgado — STEM Education Professor

Linfeng Wu — Human Factors Engineering PhD

Tyler Harper-Gampp — STEM Education PhD

Amanda Williams — Design Technician

Rebecca Planchart — Design Technician

Meghan Jack — Design Technician

Elizabeth Chen — UX Designer

Description of work

Scale Worlds is a virtual learning environment that enhances participants’ conceptions of size and scale. It has been funded through a National Science Foundation award titled Virtual Reality to Improve Students’ Understanding of Scale in STEM.

For two years I’ve worked under the guidance of Dr. Matthew Peterson and in close collaboration with professors and doctoral students from the Human Factors Engineering and STEM Education departments to design and develop this scale cognition learning environment.

In this project, I have also acted as manager for three hourly design technicians.


I have been highly involved in this project, taking on diverse tasks including drafting design documents, planning the user experience, coding in C#, 3D modeling in Blender, and developing in Unity. I have assisted in running two rounds of qualitative usability studies; results from the first round have been published in a Human Factors journal. The majority of our design decisions have been made in response to the usability studies or are supported by literature on the subject. Alejandra Magaña’s Framework for Size and Scale Cognition has been an invaluable resource during our design process.

Starting point in Scale Worlds (formerly a surgeon)


When I joined the project, the team of professors had secured NSF funding, partly on the strength of a prototype created by Grace Wonaphotimuke.

This project involves designing for two VR platforms: the Cave Automatic Virtual Environment (CAVE) and the Head-Mounted Display (HMD).




When designing the new version of the environment, we split the scheme into three main sections:

user interface: flat interactive elements

armatures: three-dimensional structures that are not entities

entities: three-dimensional objects such as animals, stars, atoms, and cells, which serve as landmarks to measure against

The way the user interacts with the environment was designed to mimic changing the exponent in scientific notation and moving the decimal point in standard notation: concepts found throughout American science and math curricula.

decimal move animation

how the user interacts with the numeric panel
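The exponent-and-decimal mechanic can be sketched in a few lines. This is an illustrative sketch only (shown in Python for brevity; the project itself is built in C#), and the function names are hypothetical rather than taken from the Scale Worlds codebase:

```python
# Hypothetical sketch of the numeric panel's core idea: scaling the user by a
# factor of ten changes the exponent in scientific notation and shifts the
# decimal point in standard notation. Not taken from the Scale Worlds codebase.

def scale_by_power_of_ten(mantissa, exponent, steps):
    """Scaling by 10**steps leaves the mantissa alone and shifts the exponent."""
    return mantissa, exponent + steps

def standard_notation(mantissa, exponent):
    """Render m x 10^e as a plain decimal string, making the moved point visible."""
    value = mantissa * 10 ** exponent
    if exponent < 0:
        return f"{value:.10f}".rstrip("0").rstrip(".")
    return f"{value:g}"

# Growing by one power of ten: 1.7 x 10^0 m becomes 1.7 x 10^1 m, i.e. 17 m.
m, e = scale_by_power_of_ten(1.7, 0, 1)
print(e, standard_notation(m, e))  # 1 17
```

Either notation describes the same change: incrementing the exponent by one is the same move as shifting the decimal point one place to the right.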

A variety of environment schemas were designed in two or more variations and then put through user testing with user interface experts. Later testing will be done with general "non-expert" students.

the forest

the path

User research revealed a path forward and opened up a conversation about "usability vs. theory."

professors in the CAVE

HMD Version

The HMD version has diverged considerably from the CAVE version. The differing affordances and conventions of the HMD and the CAVE have made for some compelling research.

controller diagram for HTC VIVE

discussing the transition from CAVE to HMD

UX user journey for HMD

Doing: User appears on a platform and sees the height declaration space in the distance.

Thinking: “What is that?”

Doing: User learns how to teleport by teleporting to the height declaration space. Along the way, they discover that they can only teleport to places when the green square appears on the platform.

Thinking: “If I hold down the trackpad button I can teleport anywhere on the platform as long as there’s a green square!”

Doing: User notices the measuring sticks and the height circle that indicates where to place their height stick.

Thinking: “I think I’m supposed to grab my height and drag it into the dashed circle...”

Doing: User looks at their controllers and notices the grab/select overlay has appeared on their controller. They are prompted to grab their height and drag it to the placement location, noticing that the color of the height marker changes to its hover state.

Thinking: “Oh, I can use the trigger to grab and drag my height to the placement circle.”

Doing: User grabs their height and drags it to the placement location, learning the grab and drag functions.

Thinking: “I can hold down the trigger to grab and drag entities.”

Doing: After the user drags their measuring stick into the height circle, the other sticks disappear. At the same moment, the HUD numeric panel loads, showing scientific notation, then standard notation.

Thinking: “This must be tracking my height, because I am 1.7 meters tall.”

Doing: User notices the pink home platform in the distance and large text behind it prompting the user to press the home button.

Thinking: “Home? Let’s see what happens if I press the side button…”

Doing: User presses the home button and is teleported to the pink platform. They are now in the scaling environment.

Thinking: “Sweet, now I know I can teleport home seamlessly.”

Doing: User notices the entities in the scaling environment. User also notices a blue ruler next to the human which documents the user’s height. There is a bridge in the distance.

Thinking: “Wow, that whale is huge! I must be the height of the blue ruler next to the human since I selected that same thing before.”

Doing: User is prompted to scale up (grow) to the size of the whale. The user presses the resize button while pointing their controller up.

Thinking: “If I point up and press the top button, I can grow to be the size of the whale!”

Doing: The user has grown to the size of a whale, and notices that the units on the numeric panel have been updated to reflect their new size.

Thinking: “WOW, I’m massive! The number on the panel changed to reflect my new height. The human looks so small now.”
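The unit update the user notices here can be illustrated with a small sketch. The unit table and thresholds below are assumptions for illustration (shown in Python for brevity; the project is written in C#), not the panel's actual logic:

```python
# Hypothetical sketch of a numeric panel choosing display units after a resize.
# The unit table is an assumption for illustration, not Scale Worlds' actual data.

UNITS = [
    (1e3, "km"),
    (1.0, "m"),
    (1e-2, "cm"),
    (1e-3, "mm"),
    (1e-6, "micrometers"),
]

def display_height(meters):
    """Pick the largest unit that keeps the displayed number at or above 1."""
    for factor, name in UNITS:
        if meters >= factor:
            return f"{meters / factor:g} {name}"
    # Smaller than every threshold: fall back to the smallest unit in the table.
    factor, name = UNITS[-1]
    return f"{meters / factor:g} {name}"

print(display_height(17.0))     # 17 m   (a whale-sized user)
print(display_height(17000.0))  # 17 km
```

Keeping the number at or above 1 while swapping units mirrors the mantissa-and-exponent framing the environment is built around.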

Doing: The user looks left and notes the information panel. They understand that this panel communicates the perspective of the entity to which they have scaled.

Thinking: “I’m seeing things from the perspective of a whale! Oh wow, I didn’t know right whales are baleen whales.”

Doing: The user turns back to the entity and decides to teleport closer.

Thinking: “Let’s get closer.”

Doing: After the user teleports closer to the whale for inspection, they notice the stacking prompt.

Thinking: “I wonder what would happen if I dragged the human entity above the stacking spot.”

Doing: User grabs the human entity and drops it above the stacking spot.

Thinking: “I wonder how many humans long this whale is...”

Doing: After the human entity is placed in the stacking box, the stacking box stretches to the width of the whale and generates the corresponding number of human entities.

Thinking: “Wow, it looks like 10 humans comprise the width of a whale!”
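The stacking mechanic comes down to a one-line calculation: divide the span being measured by the size of the stacked entity. The sketch below is illustrative (shown in Python for brevity; the project code is C#), and the sizes are example values, not Scale Worlds data:

```python
# Hypothetical sketch of the stacking mechanic: how many copies of an entity
# fill the stacking box? Sizes are illustrative, not Scale Worlds' actual data.

def stack_count(span_meters, entity_meters):
    """Number of end-to-end entity copies that span the stacking box."""
    return round(span_meters / entity_meters)

# A 1.7 m human stacked along a 17 m whale:
print(stack_count(17.0, 1.7))  # 10
```

Rounding to the nearest whole copy keeps the comparison simple for learners; a real implementation might instead show a final partial copy.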

Doing: The user looks to the right and notices the unit panel, which acts as a metric key.

Thinking: “So this is where I can relate these units I keep seeing.”

Doing: In the distance, the user sees the scenic view location and decides to teleport there.

Thinking: “Let’s check out the scenic view.”

Doing: The whale-sized user teleports to the scenic view location and reads the scenic view message.

Thinking: “Looks like I can return to human size if I use the progress panel.”

Doing: User turns around to look at the progress panel, which shows the silhouettes of all SW entities to which they’ve previously scaled.

Thinking: “Maybe the other entities will appear here once I shrink past the human.”

Doing: The user uses their controller to hover over the human silhouette, noting the hover state color change. They select the human and shrink to the size of a human.

Thinking: “I can only select human. I guess it’s the same trigger button as grab?”

Doing: The user points down and presses the resize button to shrink, which scales them down to the size of a bird – though they don’t yet know what kind of bird.

Thinking: “I’m so tiny! I wonder what kind of bird this is…”

Doing: The user presses the home button and is taken home to investigate the type of bird they’re seeing by looking at the information panel.

Thinking: “If I go home I can look at the information panel to learn about this bird.”

Doing: User looks left at the information panel.

Thinking: “So the bird I shrank to was a robin!”

Doing: Still at home, the robin-sized user turns around and notices the title wall, which contains the title information for SW as well as controller instructions.

Thinking: “It’s good to know these instructions are here.”