Open Access
MATEC Web Conf.
Volume 232, 2018
2018 2nd International Conference on Electronic Information Technology and Computer Engineering (EITCE 2018)
Article Number 01005
Number of page(s) 7
Section Network Security System, Neural Network and Data Information
DOI https://doi.org/10.1051/matecconf/201823201005
Published online 19 November 2018
