3. Chickering D., Geiger D., Heckerman D. Learning Bayesian networks: The combination of knowledge and statistical data. Machine Learning, 1995, No. 20, pp. 197-243.
4. Heckerman D., Geiger D., Chickering D.M. Learning Bayesian networks: the combination of knowledge and statistical data. Machine Learning, 1995, No. 20, pp. 131-163.
5. Heckerman D. Bayesian Networks for Data Mining. Data Mining and Knowledge Discovery, 1997, No. 1, pp. 79-119.
6. Friedman N., Geiger D., Goldszmidt M. Bayesian Network Classifiers. Machine Learning, 1997, No. 29, pp. 131-165.
7. Minsky M. Shagi k iskusstvennomu intellektu [Steps to Artificial Intelligence]. Proceedings of the IRE, 1961, No. 49, pp. 8-30. (In Russ.)
8. Mehta M., Shafer J., Agrawal R. SPRINT: A Scalable Parallel Classifier for Data Mining. Proceedings of the 22nd Int'l Conf. on Very Large Data Bases, Morgan Kaufmann, San Francisco, 1996, pp. 544-555.
9. Chubukova I. Data Mining. NOU INTUIT. Available at: https://loginom.ru/blog/decision-tree-p1 (accessed 18.11.2021).
10. Scholkopf B., Ratsch G., Muller K., Tsuda K., Mika S. An Introduction to Kernel-Based Learning Algorithms. IEEE Transactions on Neural Networks, 2001, No. 12(2), pp. 181-201.
11. Hovland C. I. Computer simulation of thinking. American Psychologist, 1960, No. 15(11), pp. 687-693.
12. Hunt E. B., Marin J., Stone P. J. Experiments in Induction. New York, Academic Press, 1966.
13. Quinlan J. R. Induction of decision trees. Machine Learning, 1986, No. 1(1), pp. 81-106.
14. Quinlan J. R. C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, 1993.
15. Murtagh F., Legendre P. Ward's Hierarchical Agglomerative Clustering Method: Which Algorithms Implement Ward's Criterion? Journal of Classification, 2014, No. 31, pp. 274-295.
16. Sneath P. H. A., Sokal R. R. Numerical Taxonomy: The Principles and Practice of Numerical Classification. San Francisco, Freeman, 1973, 573 p.
17. Hartigan J. A., Wong M. A. Algorithm AS 136: A K-Means Clustering Algorithm. Journal of the Royal Statistical Society. Series C (Applied Statistics), 1979, No. 28, pp. 100-108.
18. Ganti V., Gehrke J., Ramakrishnan R. Dobycha dannykh v sverkhbolshikh bazakh dannykh [Data Mining in Ultra-Large Databases]. Open Systems, 1999, No. 9-10. (In Russ.)
19. Zhang T., Ramakrishnan R., Livny M. BIRCH: an efficient data clustering method for very large databases. Proceedings of the 1996 ACM SIGMOD International Conference on Management of Data (SIGMOD '96), 1996.
20. Faktorny, diskriminantny i klasterny analiz [Factor, Discriminant and Cluster Analysis]. Moscow, Finansy i statistika, 1989, 215 p. (In Russ.)
21. Musaev A. A. Algoritmy analiticheskogo upravleniya proizvodstvennymi protsessami [Algorithms of analytical management of production processes]. Automation in Industry, 2004, No. 1, pp. 30-35. (In Russ.)