Who Am I

I am a self-motivated quick learner who loves challenges. I enjoy exploring the intersection of computing and human behavior, and I take pride in participating in software development at every level, informed by user-centered design.

Take a look at some of the projects I’ve worked on, and see my CV for more detail. Feel free to contact me for more information, ideas, or just to say hi!

A Few Projects:

VisMe

Media Coverage

http://www.ncsa.illinois.edu/news/story/blue_waters_intern_visualizes_a_career_in_app_development

This project is a collaboration between Dr. Alan Craig and me. We developed a pipeline for visualizing any PDB protein using Augmented Reality. Published here. We began VisMe during an NSF-funded internship hosted by Shodor, and since the conclusion of that internship we have continued to grow the project. We are excited to release this project and have more on the way. Release date (Android and iOS) TBA.

This project combines many techniques from my multidisciplinary background. I used Unity 3D and an allocation on the Blue Waters Supercomputer, worked with molecular data from the PDB, wrote in languages such as C#, JavaScript, and Tcl along with CGI scripts, and handled both the client and server sides of querying a data server. I also employed an iterative design process, surveying real professors, researchers, and students to inform the design and functionality of the application.
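As a small illustration of the kind of data handling this pipeline involves (a hedged sketch, not VisMe's actual code), extracting atom coordinates from PDB-format records is the first step before any visualization layer can turn them into geometry:

```python
# Minimal sketch (not the project's actual pipeline): pull atom names
# and 3D coordinates out of PDB-format ATOM/HETATM records, using the
# fixed-width column layout of the PDB file format.

def parse_atoms(pdb_text):
    """Return a list of (atom_name, x, y, z) tuples from PDB text."""
    atoms = []
    for line in pdb_text.splitlines():
        if line.startswith(("ATOM", "HETATM")):
            name = line[12:16].strip()   # atom name, columns 13-16
            x = float(line[30:38])       # x coordinate, columns 31-38
            y = float(line[38:46])       # y coordinate, columns 39-46
            z = float(line[46:54])       # z coordinate, columns 47-54
            atoms.append((name, x, y, z))
    return atoms

# Two example ATOM records (from a crambin-like structure) for testing.
sample = (
    "ATOM      1  N   THR A   1      17.047  14.099   3.625  1.00 13.79           N\n"
    "ATOM      2  CA  THR A   1      16.967  12.784   4.338  1.00 10.80           C\n"
)
atoms = parse_atoms(sample)
print(atoms[0][0], atoms[1][0])
```

A list of coordinates like this is what a renderer (in Unity or elsewhere) would consume to place spheres or build a mesh for the protein.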

SCIPR

The Articulab’s SCIPR project aims to understand how to foster curiosity in students by observing them in small-group settings. I am taking part in the development of an embodied conversational agent (ECA), in our study an intelligent virtual child, that engages in a collaborative tabletop game and elicits curiosity during game play. The virtual child’s behavior will be integrated into a WoZ (Wizard of Oz) system, which enables these behaviors in a semi-automated manner. Find more about my experience here. I work under PI Dr. Justine Cassell and supervisor Dr. Zhen Bai.

For this project, I took part in a highly iterative design process that involved observing audio/video data, transcription, coding schemes, creating finite state machines, and frequent play testing. In addition, I used several NLP techniques to extract and analyze the speech patterns of real children in order to build finite state machines for an AI system. The exploratory work I did as this project gathered steam lays the groundwork for the AI system that will interact with real children in the future.
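To show the shape of the finite-state-machine approach described above (the states and events here are hypothetical examples, not SCIPR's actual coding scheme), a minimal dialogue FSM can be a table of (state, event) → next-state transitions:

```python
# Hedged sketch of a dialogue finite state machine: hypothetical
# states/events, not the project's real behavior model. The agent
# moves between conversational states based on observed events.

TRANSITIONS = {
    ("idle", "child_speaks"): "listening",
    ("listening", "question_detected"): "answering",
    ("listening", "silence"): "prompting",
    ("answering", "done"): "idle",
    ("prompting", "child_speaks"): "listening",
}

def step(state, event):
    """Return the next state; stay in place if the event is unhandled."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["child_speaks", "silence", "child_speaks",
              "question_detected", "done"]:
    state = step(state, event)
print(state)  # the event sequence returns the agent to "idle"
```

In a WoZ setting, a wizard (or an automated classifier) supplies the events, and the FSM constrains which agent behaviors are available at each point in the interaction.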

AR Theater

I am taking part in the development of AR and VR systems for scenic designers in theater spaces. Our team is composed of developers, UX researchers, and scenic designers working to create an immersive tool that streamlines and simplifies communication among members of theater production teams. I am developing for AR on iOS and Android using ARKit and ARCore, as well as for HoloLens and Windows Mixed Reality devices.

Media Coverage

Ingenuity:

http://engineering.uci.edu/news/2018/6/ingenuity-2018-recognizes-influential-individuals-and-celebrates-student-innovation

Butterworth Product Development Competition:

https://www.facebook.com/UCIBrenICS/posts/10155670534906909

VR Gait Study

This project was done in collaboration with the CS 498 course at the University of Illinois at Urbana-Champaign and Dr. Manuel Hernandez of the Kinesiology & Community Health department to study gait using virtual reality. In this project we used Unity 3D, C#, and JavaScript.