1 min read

A foolproof way to shrink deep learning models

Probyto's Daily Knowledge Bytes - 125

As more artificial intelligence applications move to smartphones, deep learning models are getting smaller so that apps run faster and use less battery power. Now, MIT researchers have a new and better way to compress models. Read along!
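The byte itself doesn't describe the MIT technique, but a common way to compress a network is magnitude pruning: zero out the smallest weights and keep only the largest ones. Below is a minimal NumPy sketch of that general idea; the function name and the 90% sparsity level are illustrative assumptions, not details taken from the MIT work.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude weights, keeping the top (1 - sparsity) fraction."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)  # number of weights to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep only weights above the threshold
    return weights * mask

# Example: prune a toy 4x4 weight matrix to roughly 90% sparsity.
layer = np.random.randn(4, 4)
pruned = magnitude_prune(layer, sparsity=0.9)
print(f"Nonzero weights: {np.count_nonzero(pruned)} of {layer.size}")
```

In practice, pruning like this is usually followed by retraining the remaining weights so the smaller model recovers most of its original accuracy.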

#ArtificialIntelligence #Deeplearning #Models #MobileDevices #MachineLearning #Probyto

Source: http://news.mit.edu/…/foolproof-way-shrink-deep-learning-mo…
Follow us:
LinkedIn: https://www.linkedin.com/company/probyto/
Twitter: https://twitter.com/probyto
Instagram: https://www.instagram.com/probyto/
Facebook: https://www.facebook.com/probyto/