Posts Tagged ‘Neural Networks’

Google has announced the open source release of TensorFlow. This is their second-generation machine learning system, building on work done in the DistBelief project. TensorFlow is general, flexible, portable, easy to use, and completely open source. It is also twice as fast as DistBelief.

For context: Google's internal deep learning infrastructure, DistBelief, developed in 2011, has allowed Googlers to build ever-larger neural networks and scale training to thousands of cores in Google's datacenters. It has been used to demonstrate that concepts like "cat" can be learned from unlabeled YouTube images, to improve speech recognition in the Google app by 25%, and to build image search in Google Photos. DistBelief also trained the Inception model that won the ImageNet Large Scale Visual Recognition Challenge in 2014, and drove Google's experiments in automated image captioning as well as DeepDream.

While DistBelief was very successful, it had some limitations. It was narrowly targeted to neural networks, it was difficult to configure, and it was tightly coupled to Google’s internal infrastructure — making it nearly impossible to share research code externally.

TensorFlow's primary interface is Python, like a lot of Google infrastructure. You can download the package and run it within your own Python applications. Get started today!
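As a taste of what running TensorFlow in your own Python code looks like, here is a minimal sketch, assuming TensorFlow is installed (e.g. via `pip install tensorflow`); exact API details vary between releases, and recent versions execute operations eagerly without any session setup.

```python
# Minimal TensorFlow sketch: build two constant tensors and multiply them.
import tensorflow as tf

a = tf.constant([[1.0, 2.0]])    # 1x2 matrix
b = tf.constant([[3.0], [4.0]])  # 2x1 matrix
c = tf.matmul(a, b)              # matrix product: 1*3 + 2*4 = 11

print(c.numpy())
```

The same `tf.matmul`-style operations compose into full neural networks, with TensorFlow handling gradients and device placement.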


Kudos to Brittany Wenger from Lakewood Ranch, USA, for winning the Grand Prize at Google's Science Fair. Using a 6-node artificial neural network (see her slides) and a lot of cloud computing power, Brittany managed to train the neural network to detect malignant breast tumors with an accuracy of 99.11%.
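To give a feel for how small such a network is, here is a minimal sketch (not Brittany's actual code) of a feed-forward network with a 6-unit hidden layer, trained by gradient descent on a toy binary-classification problem (XOR); her real model was trained on tumor-sample features instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data: XOR of two inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 6))  # input -> 6 hidden nodes
b1 = np.zeros(6)
W2 = rng.normal(scale=0.5, size=(6, 1))  # hidden -> 1 output node
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(((p - y) ** 2).mean()))
    # Backward pass (squared-error loss, sigmoid derivatives).
    d_out = (p - y) * p * (1 - p)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_hid)
    b1 -= lr * d_hid.sum(axis=0)

print("final loss:", losses[-1])
```

A handful of weights and a few lines of training loop: the technique itself is accessible; the hard part, as in Brittany's project, is the data and the validation.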

Now, what is notable is that this girl is 17 years old. I was talking to some parents recently about how the amount of new knowledge being generated today is growing exponentially. What this means is that the next generation of kids will have to learn more in less time. Now, I am sure neural networks have been implemented by geniuses far younger than 17.

The comparison I would like to make here is this: I learnt neural networks at age 20 (and with minimal successful commercial application), whereas Brittany has a successful implementation of a neural network at age 17. So I would now say that:

My kids will probably be implementing neural networks at age 14-15.

Artificial intelligence is going to be more commonplace in the future.