Congrats to lab members Greg Guitchounts and Ruth Fong on winning NSF Graduate Research Fellowships. The NSF awards fellowships to roughly 2,000 students each year, out of a pool of over 16,000 applicants.
Congrats to lab member Ruth Fong on being named a Rhodes Scholar. Ruth’s been doing some exciting new work extending our perceptual annotation approach to machine learning, which leverages neuroscience-derived data to build better machine learning systems. Ruth will continue her studies in machine learning and computer science at Oxford as part of this prestigious award.
We’d like to thank the Richard A. and Susan F. Smith Family Foundation for their generous support of my group’s work via the Smith Family Awards Program for Excellence in Biomedical Research. The Smith Family Foundation’s gift will support research on the nature of plasticity in the visual cortex of developing animals.
We’re pleased to announce that our new paper was recently accepted for publication in IEEE Transactions on Pattern Analysis and Machine Intelligence, a top venue in machine learning and computer vision. This work was a collaborative effort with the lab of Ken Nakayama (from the Department of Psychology), and it exemplifies the kind of boundary-crossing, interdisciplinary work that my lab likes to do.
In this work, we describe a new machine learning technique that we call “perceptual annotation”, wherein we use carefully measured data on human perceptual performance to constrain and guide a machine learning algorithm. At a high level, our technique works by teaching machines to make mistakes more like those that a human would make, and in doing so, we produce a more robust system that is better able to generalize to new examples. As a proof of concept, we describe a face detection application of this technology and show that it achieves state-of-the-art performance, especially in conditions that are traditionally difficult for machine vision algorithms (unusual angles, occluded faces, out-of-focus images, etc.).
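To give a flavor of the general idea (not the published method; the specific weighting scheme, function names, and toy data below are all illustrative), one simple way to fold human perceptual measurements into a learner is to scale each training example’s loss by how reliably humans handle that example, so the machine isn’t pushed to be confident where humans aren’t:

```python
import numpy as np

def weighted_hinge_svm(X, y, human_acc, lr=0.1, epochs=200, lam=0.01):
    """Linear SVM trained by subgradient descent, with each example's
    hinge loss scaled by a weight derived from measured human accuracy
    on that example (a hypothetical stand-in for perceptual annotation:
    examples humans find hard contribute less to the training signal)."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    weights = human_acc  # per-example values in [0, 1]
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1  # examples violating the margin
        # Weighted hinge subgradient plus L2 regularization on w
        grad_w = lam * w - (weights[mask] * y[mask]) @ X[mask] / n
        grad_b = -np.sum(weights[mask] * y[mask]) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy usage: two well-separated clusters with uniform human accuracy.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
human_acc = np.ones(40)  # pretend humans found every example easy
w, b = weighted_hinge_svm(X, y, human_acc)
preds = np.sign(X @ w + b)
print((preds == y).mean())  # training accuracy on this toy set
```

With uniform weights this reduces to an ordinary soft-margin SVM; lowering the weight on perceptually ambiguous examples is what nudges the learned boundary toward human-like behavior.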
While this initial foray into perceptual annotation was launched using human behavioral measurement, we think that the sky is the limit on what kinds of biological data can be used to guide machine learning. We’re excited to keep pushing this work in new directions!
In the spirit of open science, we’ve started a new website, perceptualannotation.org, where we’ll be distributing code and data related to this project.
We’d like to extend a special thanks to Intel Corporation for supporting work in our group through a generous gift. These funds will help support a collaboration between our group, the group of HT Kung (Harvard Computer Science), and Intel Research to design and optimize brain-inspired algorithms and hardware.
While my time at the Rowland Institute has been amazing, my (five-year, non-renewable) term here is up, and it’s time for me to move on.
Luckily, I won’t have to move far, as I will be joining the Dept. of Molecular and Cellular Biology and the Center for Brain Science over on the main campus here at Harvard. I’m extremely excited by the new opportunities that joining this community opens up, and I very much look forward to starting this fall!
I participated today in an interesting discussion about face recognition technology and related privacy issues on National Public Radio with host Marty Moss-Coane and U Penn Law Professor Anita Allen. Thanks to the entire Radio Times crew at WHYY for the stimulating discussion.
A link to a recording of the live show can be found here.
I’m helping to organize a workshop at the IEEE Computer Vision and Pattern Recognition conference this year in Colorado Springs on “Biologically Consistent Computer Vision.” We received an impressive array of submissions, and the workshop promises to be a good opportunity for those interested in biologically inspired vision to come together and share ideas. See the workshop site for more details.