Труды КНЦ (ИНФОРМАЦИОННЫЕ ТЕХНОЛОГИИ), вып. 8/2020 (11)
9. Bengio Y. A Neural Probabilistic Language Model / Y. Bengio, R. Ducharme, P. Vincent // Journal of Machine Learning Research. - 2003. - Vol. 3. - pp. 1137-1155.
10. Sahlgren M. The distributional hypothesis / M. Sahlgren // Italian Journal of Linguistics. - 2008. - Vol. 20. - pp. 33-53.
11. Efficient Estimation of Word Representations in Vector Space / T. Mikolov [et al.] // Proceedings of the International Conference on Learning Representations (ICLR 2013). - 2013. - pp. 1-12.
12. RusVectores: About the project [Electronic resource]. - URL: https://rusvectores.org/ru/ (accessed: 08.12.2020).
13. Pennington J. GloVe: Global Vectors for Word Representation / J. Pennington, R. Socher, C. Manning // Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). - 2014. - Vol. 14. - pp. 1532-1543.
14. Савинова А.О. Analytical and synthetic languages / А.О. Савинова, Н.Д. Решетникова // Молодой ученый. - 2013. - № 59. - pp. 873-877.
15. Enriching Word Vectors with Subword Information / P. Bojanowski [et al.] // Transactions of the Association for Computational Linguistics. - 2016. - Vol. 5. - pp. 135-146.
16. Le Q. Distributed Representations of Sentences and Documents / Q. Le, T. Mikolov // Proceedings of the 31st International Conference on Machine Learning (ICML 2014). - 2014. - Vol. 32. - pp. 1188-1196.
17. Collobert R. A unified architecture for natural language processing: Deep neural networks with multitask learning / R. Collobert, J. Weston // Proceedings of the 25th International Conference on Machine Learning. - 2008. - pp. 160-167.
18. Caruana R. Multitask Learning / R. Caruana // Machine Learning. - 1997. - Vol. 28. - № 1. - pp. 41-75.
19. Ruder S. An Overview of Multi-Task Learning in Deep Neural Networks / S. Ruder // arXiv:1706.05098 [cs, stat]. - 2017.
20. Weiss D. An Upgrade to SyntaxNet, New Models and a Parsing Competition / D. Weiss, S. Petrov. - 2017.
21. DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks / L. Kong [et al.] // arXiv:1703.04474 [cs]. - 2017.
22. A novel neural topic model and its supervised extension / Z. Cao [et al.] // Twenty-Ninth AAAI Conference on Artificial Intelligence. - Citeseer, 2015.
23. Wang X. Neural Topic Model with Attention for Supervised Learning / X. Wang, Y. Yang // Twenty Third International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR. - 2020. - pp. 1147-1156.
24. Recommendation system exploiting aspect-based opinion mining with deep learning method / A. Da'u [et al.] // Information Sciences. - 2020. - Vol. 512. - pp. 1279-1292.
25. Wang X. Combination of Convolutional and Recurrent Neural Network for Sentiment Analysis of Short Texts / X. Wang, W. Jiang, Z. Luo // Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers. - Osaka, Japan: The COLING 2016 Organizing Committee, 2016. - pp. 2428-2437.
26. Comparison of neural network architectures for sentiment analysis of Russian tweets / K. Arkhipenko [et al.] // Computational Linguistics and Intellectual Technologies