In recent years, we have witnessed dramatic progress in, and the emergence of, advanced deep neural architectures in the natural language processing (NLP) domain. Advanced sequence-to-sequence (seq2seq) and transformer-based architectures have demonstrated remarkable improvements on multiple NLP tasks, including text categorization.