
Scientists Figured Out How to Make Neural Networks 90 Percent Smaller—but Just as Smart

Now that AI can get the job done without the dead weight, the applications could be huge.

A pair of MIT researchers has found a way to train a neural network that's only about a tenth the usual size, without losing any of its ability. The breakthrough could let other researchers build AI systems that are smaller, faster, and just as smart as those that exist today.

When people talk about artificial intelligence, they’re mostly referring to a class of computer programs called artificial neural networks. These programs are designed to mimic how our own brains operate, making them very intelligent and creative. They can identify the contents of photos, defeat humans at abstract games of strategy, and even drive vehicles all by themselves.

At their core, the programs are made up of collections of ‘neurons,’ just like in our own brains. Each neuron is connected to many other neurons, and each one can only perform a handful of basic calculations. But with enough of them wired together, the network as a whole can solve remarkably complex problems.
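
For a rough sense of what a single ‘neuron’ does, here is a minimal sketch in Python. The inputs, weights, and the ReLU activation are illustrative assumptions, not details taken from the research:

```python
import numpy as np

def neuron(inputs, weights, bias):
    # A weighted sum of the inputs plus a bias, passed through a
    # simple nonlinearity (ReLU). That's the whole "calculation."
    total = np.dot(weights, inputs) + bias
    return max(0.0, total)

# Three inputs feeding one neuron; the weights are its "connections."
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])
print(neuron(x, w, bias=0.2))   # -> a single output value
```

A network is just many of these stacked in layers, with each neuron's output feeding the inputs of the next layer.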

The most important thing for a solid neural network is the connections between neurons. Good connections make a good network, but bad connections leave you with nothing but junk. The process of making those connections is called training, and it’s similar to what our own brains do when we learn something new.

The only difference? Our brains regularly trim old connections that aren’t useful anymore, in a process called ‘pruning.’ We prune old or disused connections all the time, but most artificial neural networks are only pruned once, right at the end of training.

So the MIT researchers decided to try something new: prune the network regularly during training. They found that this method produced neural networks that performed just as well as ones trained the standard way, but were around 90 percent smaller and much more efficient. The pruned networks also needed less training time and, in some cases, ended up more accurate.
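
To make the idea concrete, here is a toy sketch of magnitude pruning applied periodically during training. It uses a simple linear model and made-up numbers, so it illustrates the general technique rather than the researchers' actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: learn a mostly-sparse set of weights from data.
X = rng.normal(size=(200, 50))
true_w = rng.normal(size=50) * (rng.random(50) < 0.2)  # most entries are zero
y = X @ true_w

w = rng.normal(size=50) * 0.1
mask = np.ones(50)          # 1 = connection kept, 0 = connection pruned
lr = 0.01

for step in range(1, 501):
    # Ordinary gradient-descent update on the surviving connections.
    grad = X.T @ (X @ (w * mask) - y) / len(X)
    w -= lr * grad
    w *= mask               # pruned connections stay at zero

    # Every 100 steps, prune the 20% of surviving weights closest to zero.
    if step % 100 == 0:
        alive = np.abs(w[mask == 1])
        cutoff = np.quantile(alive, 0.2)
        mask[np.abs(w) < cutoff] = 0

print(f"Connections remaining: {int(mask.sum())} of {len(mask)}")
```

A real implementation would apply the same prune-as-you-train idea to the weights of a full deep network rather than a single linear model, but the principle is the same: keep removing the connections that contribute the least.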

In the near future, researchers might use this pruning method to design even better neural networks: powerful, but lightweight enough to run on small electronic devices. In time, we could have neural networks operating nearly everywhere.

0 Replies