I like outer space, building virtual reality apps, geometric design, sketching and painting cards, and ice cream.
NASA Ames
Objective Function Allocation for Human-automation/Robotic Interaction
Designed and researched human-robot system interactions for deep space missions, applying cognitive methods to evaluate systems for improved task-load allocation and work analysis.
Revamped the architecture and framework of our proprietary simulation software, used by 20+ developers across NASA, Honeywell, University at Buffalo, Georgia Tech, and Drexel University.
Created a series of human and robot simulations in C++ to investigate key metrics of deep space HRI tasks: task load, authority-responsibility mismatch, and coherency.
Developing a VR system to track human eye movements in robotic interaction scenarios for HRI analysis.
Cofounded a start-up providing immersive music experiences for individuals and EDM concerts; researched music visualization and prototyped an early-stage processing.js/THREE.js dynamic music visualizer for VR light shows combining lighting and graphic visuals.
Dance VR
Led a team of four to design a movement-tracking jacket for dancers using the Arduino IDE, an accelerometer, and flex sensors.
Prototyped a WebVR visualization of dancers' movements in three.js to enhance visual performances.