100x Improvements in Deep Learning Performance with Sparsity | TWIML AI Podcast

On This Week in Machine Learning, our VP of Research Subutai Ahmad sat down with Sam Charrington to discuss the fundamental neuroscience research behind Numenta’s work. They discuss sensorimotor models and what it means for a model to have an inherent 3D understanding of the world. They then explore the topic of sparsity, detailing the differences between sparse and dense networks and how we are applying sparsity to drive greater efficiency in current deep learning networks.
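To give a rough sense of why sparsity helps, here is a minimal NumPy sketch (not Numenta’s implementation) comparing the multiply-accumulate work of a dense weight matrix against one where only a small fraction of weights are kept; the 512×512 shape and the ~5% density are illustrative assumptions.

```python
# Illustrative sketch only -- not Numenta's method. It shows, in plain NumPy,
# why a sparse weight matrix needs far fewer multiply-accumulates (MACs)
# than a dense matrix of the same shape.
import numpy as np

rng = np.random.default_rng(0)

# A "dense" layer: every one of the 512 x 512 weights participates.
dense_w = rng.standard_normal((512, 512))

# A "sparse" layer: keep roughly 5% of the weights, zero out the rest.
mask = rng.random((512, 512)) < 0.05
sparse_w = dense_w * mask

x = rng.standard_normal(512)

dense_macs = dense_w.size        # one multiply-accumulate per weight: 262,144
sparse_macs = int(mask.sum())    # only the surviving weights do work: ~13,000

print(f"dense MACs:  {dense_macs}")
print(f"sparse MACs: {sparse_macs}")   # roughly a 20x reduction in this toy case
print(np.allclose(sparse_w @ x, (dense_w * mask) @ x))
```

In practice the savings only materialize on hardware and kernels that skip the zeros rather than multiplying by them, which is part of what the episode covers.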

Find more details here.

Video

Authors

Subutai Ahmad • CEO
