Matthew Turk

Professor of Computer Science; Chair, Media Arts and Technology; and Co-Director, Four Eyes Laboratory, UC Santa Barbara

Matthew Turk is Professor of Computer Science at UC Santa Barbara and is affiliated with UCSB’s Media Arts and Technology Program, an interdisciplinary program positioned at the convergence of arts, media, and technology. He received a B.S. from Virginia Tech in 1982, an M.S. from Carnegie Mellon University in 1984, and a Ph.D. from MIT in 1991. In the mid-1980s he worked on robot planning and vision for autonomous robot navigation as part of DARPA’s ALV program. A paper on his dissertation research on automatic face recognition received an IEEE Computer Society Outstanding Paper award at the 1991 Conference on Computer Vision and Pattern Recognition; another paper from his thesis work received a “Most Influential Paper of the Decade” award from the IAPR MVA2000 workshop.

His current research concerns computer vision, human-computer interaction, and perceptual interfaces. He is co-director of the UCSB Four Eyes Lab, which focuses on research in “imaging, interaction, and innovative interfaces.” He serves on the Faculty Steering Committee for the Center for Information Technology and Society and the Cognitive Science Program, is a member of the UC Transliteracies Project, sits on the editorial board of the Journal of Image and Vision Computing, and chairs the ICMI Advisory Board. He has helped organize many conferences, most recently as general chair of ACM Multimedia 2006.

Links: Home page | UCSB Four Eyes Lab

Research Sample: HandVu: Vision-based Hand Gesture Interface, by Mathias Kölsch, Matthew Turk, and Tobias Höllerer. http://ilab.cs.ucsb.edu/projects/mathias/handvu_ilab.html

HandVu is a system developed at the Four Eyes Lab that enables a person’s hands to serve as input to a mobile, wearable computing device. HandVu detects, tracks, and recognizes key hand postures in real time, without the need for camera or user calibration. It works with unaugmented hands, moving cameras, dynamic backgrounds, and changing lighting conditions. Using HandVu, a mobile user can issue simple hand gestures to control local computation, for example to drive a simple graphical user interface while moving about freely.
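The per-frame pipeline HandVu implements (detect the hand, track it, recognize its posture, and hand the result to an interface) can be illustrated with a short sketch. The Python/OpenCV loop below is not HandVu’s actual code or API: it substitutes naive skin-color segmentation and a toy solidity test for HandVu’s learned hand detection, flock-of-features tracking, and trained posture recognition, and every threshold and function name in it is an assumption for illustration only.

    import cv2
    import numpy as np

    # Sketch of a detect -> track -> recognize gesture loop.
    # NOT HandVu's implementation: skin-color segmentation and the
    # solidity-based posture test below are illustrative stand-ins,
    # and all thresholds are assumptions.

    def segment_skin(frame_bgr):
        """Rough skin mask in HSV space (hypothetical thresholds)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array((0, 40, 60)), np.array((25, 180, 255)))
        # Remove small speckles so the largest blob is likely the hand.
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    def classify_posture(contour):
        """Toy posture label from contour geometry: a fist is nearly
        convex (high solidity), a spread hand is not."""
        hull = cv2.convexHull(contour)
        solidity = cv2.contourArea(contour) / max(cv2.contourArea(hull), 1e-6)
        return "closed" if solidity > 0.9 else "open"

    cap = cv2.VideoCapture(0)          # default camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = segment_skin(frame)
        # OpenCV 4.x return signature: (contours, hierarchy).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            hand = max(contours, key=cv2.contourArea)   # largest skin blob
            x, y, w, h = cv2.boundingRect(hand)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, classify_posture(hand), (x, y - 8),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
        cv2.imshow("gesture sketch", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()

In a real system such as HandVu, recognized postures would be delivered to a client application as interface events rather than merely drawn on the video feed, which is what lets hand gestures drive a GUI on a wearable device.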

[Images: HandVu wearer; HandVu interface]

  Matthew Turk, 04.23.05