
Language Processing. Brussels, Belgium: Association for Computational Linguistics, 2018, pp. 2843-2849.
13. Ju M., Miwa M., Ananiadou S. A Neural Layered Model for Nested Named Entity Recognition // Proceedings of NAACL-HLT 2018. 2018, pp. 1446-1459.
14. Chen Y. et al. A Boundary Regression Model for Nested Named Entity Recognition // arXiv:2011.14330 [cs]. 2020.
15. Dadas S., Protasiewicz J. A Bidirectional Iterative Algorithm for Nested Named Entity Recognition // IEEE Access. 2020. Vol. 8, pp. 135091-135102.
16. Shibuya T., Hovy E. Nested Named Entity Recognition via Second-best Sequence Learning and Decoding // Trans. Assoc. Comput. Linguist. 2020. Vol. 8, pp. 605-620.
17. Huang Z. et al. Iterative Viterbi A* algorithm for K-best sequential decoding // 50th Annual Meeting of the Association for Computational Linguistics, ACL 2012 - Proceedings of the Conference. 2012, pp. 611-619.
18. Russian spaCy Models Documentation [Electronic resource]. URL: https://spacy.io/models/ru#ru_core_news_sm
19. Pre-trained embeddings - DeepPavlov 0.15.0 documentation [Electronic resource]. URL: http://docs.deeppavlov.ai/en/master/features/pretrained_vectors.html#bert
20. News dataset from Lenta.Ru [Electronic resource]. URL: https://kaggle.com/yutkin/corpus-of-russian-news-articles-from-lenta

References
1. Lomov P., Malozemova M., Shishaev M. Training and Application of Neural-Network Language Model for Ontology Population. In: Silhavy R., Silhavy P., Prokopova Z. (eds) Software Engineering Perspectives in Intelligent Systems. CoMeSySo 2020. Advances in Intelligent Systems and Computing, 2020, Vol. 1295, Springer, Cham, pp. 919-926.
2. Wong W., Liu W., Bennamoun M. Ontology Learning from Text: A Look Back and into the Future. ACM Comput. Surv. (CSUR), 2011, Vol. 44, pp. 1-36.
3. Finkel J. R., Manning C. D. Nested named entity recognition. Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Vol. 1 (EMNLP '09). USA: Association for Computational Linguistics, 2009, pp. 141-150.
4. Wang W. Y., Yang D. That's So Annoying!!!: A Lexical and Frame-Semantic Embedding Based Data Augmentation Approach to Automatic Categorization of Annoying Behaviors using #petpeeve Tweets. Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, Portugal: Association for Computational Linguistics, 2015, pp. 2557-2563.
5. Luque F. M. Atalaya at TASS 2019: Data Augmentation and Robust Embeddings for Sentiment Analysis. arXiv:1909.11241 [cs]. 2019.
6. Coulombe C. Text Data Augmentation Made Simple By Leveraging NLP Cloud APIs. 2018. 33 p.
7. Devlin J. et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805 [cs]. 2018.
8. Sun Y., Jiang H. Contextual Text Denoising with Masked Language Model. Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019). Hong Kong, China: Association for Computational Linguistics, 2019, pp. 286-290.
