This wasn’t my first NIPS (I grew up in Denver, after all), but it was by far the largest. The fact that machine learning has become this popular really showed this year. People were everywhere. The deep learning sessions were crowded.
Below is a quick random sampling of my personal highlights. Gaussian, of course.
Deep Visual Analogy-Making
The presentation for this paper was great, particularly because the presenter clearly knew how to give a good talk. The topic, as the name suggests, is visual, which helps too. It got many oohs and aahs from the audience.
The saddle point problem was mentioned often, more so than in previous years, I felt. Although it is not from this year, see this paper.
Ladder Networks were also talked about quite a bit, and they are certainly pretty cool. Roughly: take a feedforward model, which serves as the encoder for the supervised task. Add a decoder that can invert the mappings at each layer of the encoder. The supervised cost is then computed from the corrupted encoder output and the target, and the reconstruction costs from the decoder are added on top, so the network can be trained in a semi-supervised setting with something like SGD.
Find more details here.
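To make the recipe above a bit more concrete, here is a minimal NumPy sketch of how the pieces of the cost fit together. All names, layer sizes, and noise/weight values are hypothetical, and it deliberately omits the lateral skip connections and batch normalization of the real architecture; it only shows a corrupted supervised cost combined with per-layer reconstruction costs against the clean encoder activations.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical sizes: 20-dim input, one 16-unit hidden layer, 10 classes.
sizes = [20, 16, 10]
Ws = [rng.normal(0, 0.1, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]   # encoder
Vs = [rng.normal(0, 0.1, (b, a)) for a, b in zip(sizes[:-1], sizes[1:])]   # decoder

def encoder(x, noise_std=0.0):
    """Feedforward encoder; adding Gaussian noise at each layer
    gives the 'corrupted' path, noise_std=0 gives the clean path."""
    zs = [x + noise_std * rng.normal(size=x.shape)]
    for W in Ws:
        z = relu(zs[-1] @ W)
        zs.append(z + noise_std * rng.normal(size=z.shape))
    return zs

def ladder_loss(x, y_onehot, lambdas=(1.0, 0.1)):
    clean = encoder(x, 0.0)       # clean activations are reconstruction targets
    corrupt = encoder(x, 0.3)     # corrupted pass
    # Supervised cost uses the *corrupted* top-layer output, as in the text.
    p = softmax(corrupt[-1])
    sup = -np.mean(np.sum(y_onehot * np.log(p + 1e-9), axis=1))
    # Decoder inverts each encoder layer; compare to the clean activations.
    recon = 0.0
    z_hat = corrupt[-1]
    for lam, V, z_clean in zip(lambdas, reversed(Vs), reversed(clean[:-1])):
        z_hat = relu(z_hat @ V)
        recon += lam * np.mean((z_hat - z_clean) ** 2)
    return sup + recon

x = rng.normal(size=(8, 20))
y = np.eye(10)[rng.integers(0, 10, 8)]
loss = ladder_loss(x, y)
```

In the full method, only the supervised term needs labels, so the reconstruction terms let unlabeled batches contribute to the same objective.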
The highway network was also making the rounds.
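For reference, the idea behind a highway layer is a learned gate that blends a transformed input with the raw input, so very deep stacks can default to passing information through. A minimal sketch (all names and dimensions are my own, not from any particular implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_layer(x, W_h, b_h, W_t, b_t):
    """One highway layer: y = T * H(x) + (1 - T) * x,
    where T is the transform gate and H a plain affine + tanh transform."""
    H = np.tanh(x @ W_h + b_h)      # candidate transform
    T = sigmoid(x @ W_t + b_t)      # gate in (0, 1)
    return T * H + (1.0 - T) * x    # carry the input through where T is small

d = 8
x = rng.normal(size=(4, d))
W_h, W_t = rng.normal(0, 0.1, (d, d)), rng.normal(0, 0.1, (d, d))
b_h = np.zeros(d)
b_t = np.full(d, -2.0)   # negative gate bias: the layer starts near the identity
y = highway_layer(x, W_h, b_h, W_t, b_t)
```

The negative gate bias is the detail that makes deep stacks trainable: early in training each layer behaves almost like the identity, so gradients flow freely.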
GANs using Laplacian Pyramids
I had a chance to sit and talk with Emily for awhile. She’s incredibly smart, a lot of fun to talk to, and of course does great work.
Ending on a cool note: Reinforcement Learning was, of course, a big topic around the breakfast tables and at the after-parties. I saw quite a few people walking around with Sutton’s book.