GOOGLE HAS powered up its TensorFlow machine learning program, allowing the open source software to run in distributed fashion across multiple machines.
Version 0.8 may seem like a pretty underwhelming update for TensorFlow, but the new chops mean that its deep learning models can now be trained across large clusters of machines simultaneously.
Machine learning gets smarter the more data it is exposed to, so the update could unlock a significant amount of potential for TensorFlow and its uses. But we don't reckon it will evolve into a fully fledged artificial intelligence that rises up and makes humans its pets any time soon.
People keen to spread TensorFlow out to a cluster of machines can tap into the new libraries released by Google. The firm uses these libraries to train its Inception neural network, and they let developers define their own distributed models for the machine learning software.
Google claimed that TensorFlow's distributed architecture gives coders a high degree of flexibility in defining the models they use to train the software.
“This architecture makes it easier to scale up a single-process job to use a cluster, and to experiment with novel architectures for distributed training,” said Google’s research team.
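To give a flavour of what "scaling up a single-process job to use a cluster" looks like, distributed TensorFlow describes a cluster as named jobs (typically parameter servers and workers), each a list of host:port addresses, and pins operations to tasks with device strings. Here is a minimal sketch of that idea in plain Python; the hostnames are hypothetical, and the `tf.train.ClusterSpec`/`tf.train.Server` calls mentioned in the comments are the entry points Google documented for the 0.8 release:

```python
# Hypothetical cluster layout: two parameter servers and two workers.
# In TensorFlow 0.8 this dict would be handed to tf.train.ClusterSpec(...),
# and each process would then start a gRPC server with something like
#   server = tf.train.Server(cluster, job_name="worker", task_index=0)
cluster = {
    "ps":     ["ps0.example.com:2222", "ps1.example.com:2222"],
    "worker": ["worker0.example.com:2222", "worker1.example.com:2222"],
}

def device_name(job, task):
    """Build the device string TensorFlow uses to pin ops to a task,
    e.g. "/job:ps/task:0" for the first parameter server."""
    assert job in cluster and 0 <= task < len(cluster[job])
    return "/job:%s/task:%d" % (job, task)

# Model variables would typically be pinned to the parameter servers:
print(device_name("ps", 0))      # -> /job:ps/task:0
# ...and the compute-heavy training ops to the workers:
print(device_name("worker", 1))  # -> /job:worker/task:1
```

In real distributed TensorFlow code, such a device string is used as `with tf.device(...)` around graph construction; the sketch above sidesteps importing TensorFlow itself and just illustrates how a cluster and its tasks are named.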
Coders have gone pretty crazy over TensorFlow since Google made it open source last year, making it the most popular machine learning framework on GitHub. So the addition of distributed training should see developers clamouring to get their hands on it. We only hope they use the new capabilities responsibly.
Google is quite happy chowing on its own dog food and uses TensorFlow in its Translate and Photos services. Pushing out TensorFlow as open source software with more capabilities should see developers building new functions which Google can then use to improve its own smart services. We guess it gives with one hand and takes with the other.
But before Google or other software makers get too carried away using community-contributed code built on top of TensorFlow, it's worth remembering that too many cooks can spoil the broth, or in this case unknowingly leave huge security vulnerabilities in widely distributed code. µ