
________________________________________________________________________________________
IDEATION SKETCHES

As part of Interaction Design coursework, I teamed up with classmates to develop a STEM VR laboratory experience from concept to prototype.
Working primarily as the visual designer, I began with concept ideation sketches for proposed multimedia and hardware applications, then spent considerable time refining interaction design principles across low-, mid-, and high-fidelity wireframes.
Amid the COVID era, our team set out to design remote learning solutions offering students a more immersive laboratory experience. With the pandemic mandating stay-at-home orders, education moved online and school facilities sat unused.
But what does this mean for a STEM curriculum (Science, Technology, Engineering, and Math), which traditionally benefits from laboratory experimentation?

Suspecting that physical experiments cannot be conducted on-screen with any true sense of realism or lasting educational value, we validated our assumptions through user research surveys of high school and college students. The insights revealed that without a tangible laboratory environment, lessons felt almost entirely ineffectual.
How might we better empower our students to learn from home?
We eventually narrowed our product design to a VR implementation of remote, teacher-assisted laboratory experiments, but we did not come to this decision lightly. There was much deliberation among the team regarding the best way to deliver such an experience.

We toyed with physical kits sent by mail, but this raised the possibility of hazardous materials being provided to, or sourced by, users conducting experiments at home. These solutions would all rely on pre-recorded instruction and thus lacked oversight. Exploring physical device assistance for home experimentation further, we considered a smart beaker/scale system, as well as AR interactions on smartphone and tablet screens to deliver instructional prompts.
Eventually we ruled out at-home experimentation, not only because of the danger it could pose to inexperienced users, but also to better support a group learning environment by structuring virtual labs on a set schedule. Having students conduct experiments with live instruction, rather than at their leisure, proved vital to the learning experience.

Other third-party (and perhaps theoretical) device implementations were suggested, including a home-assistant device narrating pre-recorded or live instruction, coupled with motion detection, Google Glass, or gaming console integration, and, venturing further into fantasyland, a proposed holographic smart table.

If our laboratory experience was to remain virtual, the question became: how could we make it better than a typical video lecture or Zoom class? The clear answer was VR goggles paired with smart-glove technology, which would ideally provide haptic feedback, bolstering the learning environment by engaging the sense of touch.
Now we were onto something!
To design the best possible UI, we focused our research on understanding user pain points before, during, and after lab experimentation. With the results of our study, we drafted the user story below.

________________________________________________________________________________________
LOW-FIDELITY WIRES

Our team's first stab at wireframing included a mobile-to-VR headset transition and, unfortunately, a somewhat disjointed onboarding process. Providing descriptions and pre-recorded videos covering lab safety and direction seemed necessary, but users quickly bypassed these resources in testing. Learnability became our focus for subsequent wireframe iterations.
Within the VR frames, we placed an overabundance of emphasis on visibility of system status: the experiment status bar sat front and center, perhaps obtrusively so.
A professor admin view was framed out to show individual student screens, step status, and query notifications. However, this frame was not explored beyond lo-fi, as it was not integral to the user task at hand: conducting a frog dissection.
Our prototype evolved to explore how a user might absorb corrective instruction in a VR lab environment. One proposed solution took the form of holographic or projected guides overlaid on the subject. In the context of the frog dissection, this effort became 'Incision Assist'.

________________________________________________________________________________________
MID-FIDELITY WIRES

Moving into mid-fidelity by week 3, we added a splash of color, more defined user appendages and tools, and a dark-mode background reminiscent of TRON, along with other vital changes to the UI.
Our system-status visibility issues seemed resolved by moving the step status bar toward the top of the user's view, but a second round of testing revealed a stark lack of guidance during the experiment. Many of the same users skipped the preparatory onboarding pages containing instructions and, as a result, were lost in the lab.
We had always assumed instructions would be provided verbally throughout the experiment, but for accessibility's sake they must also appear visually. By this iteration, visual cues had been added to deliver brief (but loud) directional disclaimers. In testing, these prompts proved both insufficient and intrusive.
Additional confusion centered on a final step involving the submission of a laboratory report. Ultimately, the post-lab report submission was deemed unnecessary and scrapped.
________________________________________________________________________________________
HIGH-FIDELITY PROTOTYPE
