FACEBOOK HAS confirmed that it has switched to using its neural network systems to carry out translations on its social network.
The Caffe2 deep learning framework is the basis of the system. The company made it available for general use as an open source project in the spring, pitching it against Google's TensorFlow and Nvidia's GPU offerings.
The work, carried out in conjunction with Facebook AI Research (FAIR), has also seen Recurrent Neural Networks (RNNs) added to Facebook's AI repertoire.
Here's the Caffe2 team to explain: "Using Caffe2, we significantly improved the efficiency and quality of machine translation systems at Facebook. We got an efficiency boost of 2.5x, which allows us to deploy neural machine translation models into production.
"As a result, all machine translation models at Facebook have been transitioned from phrase-based systems to neural models for all languages. In addition, several product teams at Facebook, including speech recognition and ads ranking, have started using Caffe2 to train RNN models."
And if you need more than that, read the blog post; it goes into spectacular detail.
"Caffe2 provides a generic RNN library where the RNN engine is an almost zero-overhead virtual box for executing arbitrary RNN cells. Under the hood, a cell is just a smaller Caffe2 network and benefits from all typical Caffe2 performance advantages.
"We also have a rich set of APIs that let people use existing RNNCells and implement new ones using Python. MultiRNNCell allows for easy composition of existing cells into more complex ones. For example, you could combine several layers of LSTMCells and then put an AttentionCell on top."
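To make the composition idea a bit more concrete, here's a minimal pure-Python sketch of stacking recurrent cells, where each layer's output becomes the next layer's input. This is an illustration of the concept only, not the Caffe2 API: the class names `SimpleRNNCell` and `StackedCell` and all the weights are made up for this example.

```python
# Illustrative sketch only -- NOT the Caffe2 API. Shows the general idea
# of composing recurrent cells into a stack, as MultiRNNCell does.
import math

class SimpleRNNCell:
    """A toy recurrent cell: h' = tanh(w_x * x + w_h * h)."""
    def __init__(self, w_x, w_h):
        self.w_x = w_x
        self.w_h = w_h

    def step(self, x, h):
        return math.tanh(self.w_x * x + self.w_h * h)

class StackedCell:
    """Compose several cells; layer N's output feeds layer N+1."""
    def __init__(self, cells):
        self.cells = cells

    def step(self, x, states):
        new_states = []
        for cell, h in zip(self.cells, states):
            x = cell.step(x, h)  # this layer's output is the next layer's input
            new_states.append(x)
        return x, new_states

# Run a short input sequence through a two-layer stack.
stack = StackedCell([SimpleRNNCell(0.5, 0.3), SimpleRNNCell(0.8, 0.2)])
states = [0.0, 0.0]
for x in [1.0, -0.5, 0.25]:
    out, states = stack.step(x, states)
print(round(out, 4))
```

In Caffe2's real implementation each cell is itself a small Caffe2 network rather than a Python object, which is where the near-zero-overhead claim comes from.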
Yeah. More like that. Good luck.
It also uses the word "blobs" which made us very happy on a Friday afternoon.
Speaking of which, the change has, we're told, made it more likely that translations of slang, typos and inference from context will work. Which will spoil the fun for anyone who likes running things through a translator and back again.
Facebook was recently rumoured to have had to abandon an experiment after its AI went rogue. And yes, we reported it. It didn't and it wasn't and then we all had ice cream. µ