Deep Learning at Facebook AI Research

Our unrelenting willingness to pour data about ourselves into Facebook's vaults has created the perfect breeding ground for data scientists to gather and build the tools to manipulate it. When the other big machine learning houses of the world, Google, Baidu, Netflix and so on, are all generating value by recycling their data, remaining a mere data landfill means risking being left behind.

Interest in artificial intelligence through neural nets has historically been cyclical, growing in popularity on a wave of hope and ambition, only to fade when the hard problems began to bite. The current resurrection of neural network research, funded by the likes of Facebook building laboratories of their own, has taken on a life of its own, a product of free market economics.

This means that researchers who might previously have remained in academia are now applying their work to help organisations like Facebook and Google stay ahead of the competition.

Yann LeCun is known for his work developing convolutional neural networks, an approach inspired by biological visual architecture. The technology was still in production use at major US banks throughout the so-called AI winter and into the early 2000s, during LeCun's time at AT&T Labs and, later, New York University.

When Mark Zuckerberg decided to point Facebook's considerable resources at a research arm encompassing deep learning, LeCun was hired to head up the organisation, instantly injecting established, thoroughbred AI roots into the core of Facebook.

Facebook AI Research (FAIR) have made their objective very clear – build the best AI lab in the world.

As part of this objective, FAIR feed some of their work into the Torch project, an open source development environment with an emphasis on deep learning and convolutional nets, which is used at a variety of labs including Google's DeepMind and Twitter.

In January, Facebook open sourced its work on improving the performance of convolutional neural networks under Torch. Since then there have been two further revisions of the accompanying paper, which was submitted to the International Conference on Learning Representations (ICLR) in December 2014.

Facebook have been able to achieve better performance with CNNs for their specific domain by customising the fast Fourier transform (FFT), an algorithm that converts a signal into frequency-domain coefficients. The relevance to CNNs is the convolution theorem: convolution in the original domain is equivalent to simple pointwise multiplication in the frequency domain, so a faster FFT translates directly into faster convolution layers.
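To make that identity concrete, here is a minimal NumPy sketch of FFT-based convolution (plain CPU Python, not FAIR's CUDA code): transform both signals, multiply the spectra pointwise, transform back, and the result matches a direct convolution.

```python
import numpy as np

def fft_convolve(signal, kernel):
    """Convolve two 1-D arrays via the FFT (the convolution theorem).

    Pointwise multiplication in the frequency domain is equivalent
    to convolution in the original domain.
    """
    n = len(signal) + len(kernel) - 1       # length of the full convolution
    size = 1 << (n - 1).bit_length()        # pad to a power of two for speed
    spectrum = np.fft.rfft(signal, size) * np.fft.rfft(kernel, size)
    return np.fft.irfft(spectrum, size)[:n]

signal = np.random.rand(1024)
kernel = np.random.rand(64)

# The FFT route agrees with the direct O(n * k) convolution to float precision.
assert np.allclose(fft_convolve(signal, kernel), np.convolve(signal, kernel))
```

For the 2-D feature maps of a convolutional net the same identity holds with 2-D transforms, which is why a faster FFT kernel pays off across every convolution layer.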

The FAIR implementation, fbfft, achieves a 1.5x speedup over NVIDIA's cuFFT library, which itself computes transforms around 10x faster than standard FFT implementations. The plan is for FAIR to release faster implementations, and with them reduced training times, as they become available.
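As a rough illustration of why this matters, the CPU sketch below times direct convolution against SciPy's FFT-based fftconvolve as the kernel grows. The absolute numbers have nothing to do with fbfft's GPU results; they simply show the asymptotic gap, O(n * k) versus roughly O(n log n), that makes the FFT route attractive for the large kernels found in CNN training.

```python
import time
import numpy as np
from scipy.signal import fftconvolve

signal = np.random.rand(1 << 16)

for klen in (16, 256, 4096):
    kernel = np.random.rand(klen)

    t0 = time.perf_counter()
    np.convolve(signal, kernel)      # direct method: O(n * k)
    t_direct = time.perf_counter() - t0

    t0 = time.perf_counter()
    fftconvolve(signal, kernel)      # FFT method: ~O((n + k) log(n + k))
    t_fft = time.perf_counter() - t0

    print(f"kernel {klen:5d}: direct {t_direct:.4f}s, fft {t_fft:.4f}s")
```

For tiny kernels the direct method wins, which is why a tuned FFT such as fbfft is about shifting that crossover point, not replacing direct convolution everywhere.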

Jeff Johnson, a Research Engineer at FAIR, gave a talk at MLConf in New York City last month describing the tools and techniques used for convolution in the work at FAIR.
