Hello everyone. It is now the greatest time of the year, and here we are today, ready to be amazed by Deep Learning. Last time, we went through a neural machine translation project using the renowned Sequence-to-Sequence model empowered with Luong attention. As we saw, introducing an attention mechanism noticeably improved the Seq2Seq model’s performance. For those who haven’t seen my last post, I recommend that you have …