2017 GTC San Jose

S7553 - Exploring Sparsity in Recurrent Neural Networks

Session Description

Recurrent neural networks are widely used to solve a variety of problems. As the quantity of data and the amount of available compute have increased, model sizes have also grown. We'll describe an approach to reducing the parameter count of RNNs using a simple pruning schedule, without increasing training time. The reduction in parameters achieves two goals: it shrinks the network so it can be deployed on mobile and embedded devices, and it reduces evaluation time during inference. We'll demonstrate how this technique works for vanilla RNNs and for the more complex gated recurrent units.
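To make the idea concrete, below is a minimal sketch of schedule-driven magnitude pruning, one common way such a pruning schedule can be realized during training. The linear sparsity ramp, the step boundaries, and helper names such as sparsity_target and magnitude_prune are illustrative assumptions, not details taken from the session.

import numpy as np

def sparsity_target(step, start_step, end_step, final_sparsity):
    # Linearly ramp the pruned fraction from 0 to final_sparsity between
    # start_step and end_step, then hold it constant.
    if step < start_step:
        return 0.0
    if step >= end_step:
        return final_sparsity
    return final_sparsity * (step - start_step) / (end_step - start_step)

def magnitude_prune(weights, sparsity):
    # Zero out the smallest-magnitude entries so that roughly `sparsity`
    # of the matrix is zero; return the binary mask of surviving weights.
    if sparsity <= 0.0:
        return np.ones(weights.shape, dtype=bool)
    k = int(sparsity * weights.size)
    threshold = np.partition(np.abs(weights), k, axis=None)[k]
    mask = np.abs(weights) >= threshold
    weights[~mask] = 0.0
    return mask

rng = np.random.default_rng(0)
W_hh = rng.normal(size=(128, 128)).astype(np.float32)  # toy hidden-to-hidden weights

for step in range(10001):
    # ...forward pass, backward pass, and weight update would go here...
    if step % 100 == 0:  # prune periodically rather than at every step
        target = sparsity_target(step, start_step=2000, end_step=8000, final_sparsity=0.90)
        mask = magnitude_prune(W_hh, target)
    W_hh *= mask  # keep pruned weights at zero after each (hypothetical) update

print(f"final sparsity: {np.mean(W_hh == 0.0):.1%}")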


Additional Session Information
Level: Intermediate
Session Type: Talk
Topics: Algorithms; Deep Learning and AI
Industry: Software
Duration: 25 minutes