Computer and Modernization ›› 2021, Vol. 0 ›› Issue (07): 71-76.


Intention Recognition and Classification Based on BERT-FNN

  

  1. (College of Sciences, Northeastern University, Shenyang 110004, China)
  • Online: 2021-08-02  Published: 2021-08-02

Abstract: Intention recognition and classification is an important problem in the field of natural language processing, and understanding a user's intention from context is a key difficulty in intelligent robots and intelligent customer service. Traditional intention classification relies mainly on rule-based methods or machine learning methods, which suffer from high computational cost and poor generalization. To address these problems, this paper designs a model based on Google's BERT pre-trained language model to perform context modeling and sentence-level semantic representation of the text: the vector corresponding to the [CLS] token is used to represent the context of the sentence, and a fully-connected neural network (FNN) then extracts sentence-level features. To make full use of the data, the paper applies the dismantling (one-vs-rest) strategy to convert the multi-class problem into multiple binary classification problems: each class in turn serves as the positive example while the remaining classes serve as negative examples, generating multiple binary classification tasks whose results are combined to perform intention classification. Experimental results show that the proposed method outperforms traditional models, achieving an accuracy of 94%.

Key words: natural language processing, intention recognition, BERT, FNN, dismantling method
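
The following is a minimal sketch, not the authors' released code, of the architecture the abstract describes: a BERT encoder whose [CLS] vector feeds a fully-connected head, with the multi-class task decomposed one-vs-rest into per-class binary classifiers. The checkpoint name (bert-base-chinese), hidden size, number of classes, and the example sentence are illustrative assumptions rather than details taken from the paper.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertFNNOvR(nn.Module):
    """BERT [CLS] representation + fully-connected head with one binary (one-vs-rest) output per intent class."""
    def __init__(self, num_classes, hidden_size=256, bert_name="bert-base-chinese"):  # sizes/checkpoint are assumptions
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.fnn = nn.Sequential(
            nn.Linear(self.bert.config.hidden_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, num_classes),  # each output is one binary "this class vs. the rest" task
        )

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls_vec = outputs.last_hidden_state[:, 0]   # vector at the [CLS] token, used as the sentence representation
        return self.fnn(cls_vec)                    # raw logits, one per class

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertFNNOvR(num_classes=5)                  # class count is a placeholder

batch = tokenizer(["查询明天的天气"], padding=True, return_tensors="pt")  # hypothetical user utterance
logits = model(batch["input_ids"], batch["attention_mask"])

# Training: each class is treated as its own binary problem (positive vs. rest),
# so a per-class sigmoid with binary cross-entropy replaces a single softmax.
targets = torch.zeros_like(logits)
targets[0, 2] = 1.0                                # gold intent marked as the positive class
loss = nn.BCEWithLogitsLoss()(logits, targets)

# Inference: the class whose binary classifier is most confident is the predicted intent.
pred = torch.sigmoid(logits).argmax(dim=-1)

Combining the per-class binary decisions by taking the most confident positive is one common way to merge one-vs-rest results; the paper does not specify its exact combination rule here.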