To examine visual feedback in multi-touch interaction design, I built a visual programming environment for assembling interfaces on a multi-touch tablet. It serves as a platform for exploring visual feedback in multi-touch interaction. Multiple visual feedback paradigms are implemented on top of a common core visual vocabulary, consisting of entities such as regions, links, and containers.
The environment can be used to rapidly construct musical instruments; sample tasks were designed around this and used in a human-subject usability study. In these examples, a musical keyboard and a multi-touch mixer instrument are built.
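The core visual vocabulary can be thought of as a small patching data model. A minimal sketch in Python (the actual environment is written in Lua on top of urMus; all class and function names here are illustrative, not the real API):

```python
# Hypothetical data model for the core visual vocabulary:
# regions connected by directed links, grouped into containers.

class Region:
    """A touchable on-screen unit that produces or processes signals."""
    def __init__(self, name):
        self.name = name
        self.outgoing = []  # links leaving this region

class Link:
    """A directed connection carrying signals/events between regions."""
    def __init__(self, src, dst):
        self.src, self.dst = src, dst
        src.outgoing.append(self)

class Container:
    """Groups regions so they can be moved or collapsed together."""
    def __init__(self, regions):
        self.regions = list(regions)

# Assemble a toy "keyboard -> mixer -> speaker" patch.
key = Region("keyboard")
mix = Region("mixer")
out = Region("speaker")
Link(key, mix)
Link(mix, out)
patch = Container([key, mix, out])

def downstream(region):
    """Names of regions reachable from `region` by following links."""
    seen, stack = [], [region]
    while stack:
        for link in stack.pop().outgoing:
            if link.dst not in seen:
                seen.append(link.dst)
                stack.append(link.dst)
    return [r.name for r in seen]

print(downstream(key))  # → ['mixer', 'speaker']
```

Visual feedback paradigms can then be layered on top of this shared model, e.g. by drawing links differently while leaving the region/link graph unchanged.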
Videos: Task 1, Task 2
Tools: urMus, Lua, iOS
Publication pending
I used the Kinect depth sensor to augment a traditional keyboard instrument with a 3D gesture space; a top-down projection provides visual feedback at the site of the gesture interaction.
This novel interaction model enabled us to explore different visualizations.
Tools: Kinect, Processing, OpenFrameworks, OpenCV
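The basic sensing step maps overhead depth readings above the keys to a gesture parameter. A simplified sketch, assuming made-up calibration distances (the actual system uses Processing/OpenCV; `gesture_height` and the constants are hypothetical):

```python
# Map the nearest depth reading above the keyboard to a normalized
# hand height, which can drive a synthesis parameter or visualization.

KEYBOARD_PLANE_MM = 900   # assumed sensor-to-key-surface distance
MIN_HAND_MM = 500         # hands closer than this are clamped

def gesture_height(depth_row):
    """Normalized hand height in [0, 1]; 0 = at the keys, 1 = fully raised."""
    nearest = min(depth_row)  # closest point to the overhead sensor
    clamped = max(MIN_HAND_MM, min(KEYBOARD_PLANE_MM, nearest))
    return (KEYBOARD_PLANE_MM - clamped) / (KEYBOARD_PLANE_MM - MIN_HAND_MM)

# A fake one-row "depth frame" with a hand 700 mm from the sensor:
frame = [900, 880, 700, 860, 900]
print(round(gesture_height(frame), 2))  # → 0.5
```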
Evaluating Gesture-Augmented Piano Performance, Qi Yang and Georg Essl, Computer Music Journal, 2014 (PDF)
Visual Associations in Augmented Keyboard Performance, Qi Yang and Georg Essl, NIME 2013
Augmented Piano Performance using a Depth Camera, Qi Yang and Georg Essl, NIME 2012
Visualizing the contact network between six dormitories, by hour of day
As part of the ExFlu study by the University of Michigan School of Public Health, I cleaned and analyzed multi-sensor data collected from 100 phones over three months, including Bluetooth and Wi-Fi contacts, accelerometer readings, and battery levels. Between-phone Bluetooth contact data were used to visualize social contact between study participants.
I also coordinated the collection of GPS positions of local Wi-Fi access points, which enabled me to localize activity on campus and visualize hot spots.
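The hourly contact aggregation behind the visualization can be sketched as follows (the real pipeline ran over an MS SQL store; the scan rows here are made up for illustration):

```python
# Count undirected phone-to-phone Bluetooth contacts per hour of day,
# the input to an hour-by-hour contact-network visualization.
from collections import Counter
from datetime import datetime

# (phone_a, phone_b, timestamp) rows from Bluetooth proximity scans
scans = [
    ("p1", "p2", "2013-02-01 09:15:00"),
    ("p1", "p3", "2013-02-01 09:40:00"),
    ("p2", "p3", "2013-02-01 21:05:00"),
]

def contacts_by_hour(rows):
    """Counter keyed by (hour_of_day, undirected phone pair)."""
    counts = Counter()
    for a, b, ts in rows:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").hour
        pair = tuple(sorted((a, b)))  # undirected edge
        counts[(hour, pair)] += 1
    return counts

print(contacts_by_hour(scans))
```

Each hourly slice of these counts becomes one frame of the dormitory contact-network animation.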
Tools: Python, MS SQL, KML, matplotlib
I developed the website as part of the web development team of Harvest Mission Community Church. I also lead the upcoming redesign of the nonprofit organization's web presence.
Tools: PHP, html+CSS, Sketch
In collaboration with Sang Won Lee, I designed the user interface and product concepts for a web-based writing application and a companion mobile app. The application supports timed playback of the writing process as well as rich text-based expression; the companion app captures nuanced typing gestures to enrich texting-like communication.
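The timed-playback idea amounts to logging each edit with a timestamp and replaying the log up to a chosen moment. A minimal sketch under that assumption (the `TypedLog` class is hypothetical, not the product's actual implementation):

```python
# Record timestamped edit events, then reconstruct the visible text
# at any point in time to "play back" the writing process.

class TypedLog:
    def __init__(self):
        self.events = []  # (timestamp, char) pairs; '\b' = backspace

    def key(self, char, t):
        self.events.append((t, char))

    def text_at(self, t):
        """Reconstruct the visible text at time t."""
        buf = []
        for ts, ch in self.events:
            if ts > t:
                break
            if ch == "\b":
                if buf:
                    buf.pop()
            else:
                buf.append(ch)
        return "".join(buf)

log = TypedLog()
for i, ch in enumerate("hellp\bo"):  # a typo, corrected mid-stream
    log.key(ch, i * 0.2)  # one keystroke every 200 ms

print(log.text_at(0.9))  # mid-playback, before the correction → "hellp"
print(log.text_at(2.0))  # final text → "hello"
```

Because the typo and its correction are both in the log, playback shows the writing process rather than only the finished text.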
For a graduate Computer Network Security course project, we created a mobile phishing attack and a proof-of-concept defense against it. The demonstration attack targets the iOS mobile browser, mimicking the native application user interface closely enough to pass casual observation. The defense uses keyboard input monitoring to intercept possibly sensitive information.
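One way such input monitoring can flag sensitive data is by checking typed digit runs against the Luhn checksum used by payment cards. An illustrative sketch of that idea (the real proof of concept hooks the iOS keyboard; this standalone detector is an assumption for illustration):

```python
# Flag typed text containing digit runs that look like card numbers,
# using the Luhn checksum as a plausibility filter.
import re

def luhn_ok(digits):
    """Luhn checksum used to validate candidate card numbers."""
    total = 0
    for i, d in enumerate(reversed(digits)):
        n = int(d)
        if i % 2 == 1:      # double every second digit from the right
            n *= 2
            if n > 9:
                n -= 9
        total += n
    return total % 10 == 0

def looks_sensitive(typed):
    """True if the text contains a 13-16 digit Luhn-valid run."""
    return any(luhn_ok(m.group())
               for m in re.finditer(r"\d{13,16}", typed))

print(looks_sensitive("my card is 4111111111111111"))  # standard Visa test number
print(looks_sensitive("call me at 7345551234"))        # too short, not flagged
```

A monitor like this would warn or block before the flagged input reaches an untrusted page.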