Recurrent Neural Networks are no longer the king of Natural Language Processing

Natural Language Processing

Recurrent Neural Networks (RNNs) have been praised for their ability to retain short-term memory, and, with the Long Short-Term Memory (LSTM) variant, long-term memory as well. They have been the go-to models for Natural Language Processing because context is essential when predicting the next word or character. Researchers have recently shown that a variant of Convolutional Neural Networks (CNNs), the Gated CNN, can achieve performance comparable to RNNs at a lower computational cost. CNNs' claim to fame is reducing the dimensionality of the data, which makes the operations far more cost-efficient; they are hugely popular in image recognition, where they extract the major features while discarding unnecessary detail. The source can be found below:

https://towardsdatascience.com/how-to-build-a-gated-convolutional-neural-network-gcnn-for-natural-language-processing-nlp-5ba3ee730bfb
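For readers curious what "gated" means in practice, here is a minimal PyTorch sketch of a single gated convolution block in the spirit of the architecture the linked article describes: one convolution produces candidate features and a second convolution, passed through a sigmoid, acts as a learned gate over them. The class name, kernel size, and channel counts below are illustrative assumptions, not details taken from the article.

import torch
import torch.nn as nn

class GatedConv1d(nn.Module):
    # One gated convolutional block: output = conv(x) * sigmoid(gate(x)).
    # The sigmoid branch decides how much of each convolved feature passes through.
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        # Left-pad the sequence so the convolution is causal
        # (a position never sees tokens to its right).
        self.pad = kernel_size - 1
        self.conv = nn.Conv1d(channels, channels, kernel_size)
        self.gate = nn.Conv1d(channels, channels, kernel_size)

    def forward(self, x):
        # x has shape (batch, channels, sequence_length)
        x = nn.functional.pad(x, (self.pad, 0))
        return self.conv(x) * torch.sigmoid(self.gate(x))

# Toy usage: a batch of 2 sequences, 16 embedding channels, 10 tokens.
block = GatedConv1d(channels=16, kernel_size=3)
out = block(torch.randn(2, 16, 10))
print(out.shape)  # torch.Size([2, 16, 10])

Because every position in the sequence is convolved independently of the others, these blocks can be computed in parallel across the whole sequence, which is where the computational savings over a step-by-step RNN come from.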
