Print ISSN: 0976-3503 | Online ISSN: 0976-2930
Journal of E-Technology

An Ensemble Learning-based Model for Classification of Insincere Question
Zhongyuan Han, Jiaming Gao, Huilin Sun, Ruifeng Liu, Chengzhe Huang, Leilei Kong, Haoliang Qi
Heilongjiang Institute of Technology, Harbin, China; Harbin Engineering University, Harbin, China
Abstract: This paper describes our method for the Classification of Insincere Questions (CIQ) task at FIRE 2019. In this evaluation, we use an ensemble learning method to combine multiple classification models: logistic regression, support vector machine, Naive Bayes, decision tree, K-Nearest Neighbors, and Random Forest. The results show that our classifier achieves a 67.32% accuracy rate (ranked first) on the test dataset.
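The ensemble described in the abstract can be illustrated with a minimal sketch. This is not the authors' released code; it assumes scikit-learn (which the paper itself builds on) and combines the six listed classifiers with a soft-voting ensemble on synthetic stand-in data:

```python
# Hypothetical sketch (not the authors' code): uniting the six classifiers
# named in the abstract via scikit-learn's VotingClassifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Toy binary data standing in for the CIQ question features.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True, random_state=0)),
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    voting="soft",  # average predicted class probabilities across models
)
ensemble.fit(X_train, y_train)
print(f"ensemble accuracy: {ensemble.score(X_test, y_test):.3f}")
```

Soft voting averages each model's predicted probabilities, so a strong model's confident vote can outweigh several uncertain ones; the paper's actual combination scheme and features for insincere questions may differ.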
Keywords: Classification, Insincere Question, Ensemble Learning
DOI: https://doi.org/10.6025/jet/2020/11/2/64-69
Full Text: PDF, 97 KB | Downloads: 197


 

Copyright © 2011 dline.info