Trudy KNTs (Transactions of the Kola Science Centre), issue 12 (INFORMATION TECHNOLOGIES, issue 5/2021(12))

improve the quality of the trained model while using an augmented dataset that is smaller than the one required by the current approach.

Notes

* Adapted translation of the article: Lomov P.A. Data Augmentation in Training Neural-Network Language Model for Ontology Population / P.A. Lomov, M.L. Malozemova, M.G. Shishaev // Data Science and Intelligent Systems: Lecture Notes in Networks and Systems / ed. R. Silhavy. Cham: Springer International Publishing, 2021. pp. 669-679.
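The core idea of text data augmentation discussed above can be illustrated with a deliberately minimal sketch: generating sentence variants by probabilistic token substitution. This is a hypothetical stand-in, not the authors' method; in the paper the substitute candidates come from a masked language model rather than the hand-made dictionary used here, and all names (`augment`, `subs`) are illustrative.

```python
import random

def augment(tokens, substitutes, p=0.3, rng=None):
    """Return a variant of `tokens` where each token is replaced,
    with probability p, by a random substitute from the dictionary."""
    rng = rng or random.Random(0)
    out = []
    for tok in tokens:
        cands = substitutes.get(tok)
        if cands and rng.random() < p:
            out.append(rng.choice(cands))
        else:
            out.append(tok)
    return out

# Toy substitution dictionary; a masked-LM-based approach would instead
# predict context-appropriate replacements for each masked position.
subs = {"model": ["network", "classifier"], "improves": ["increases"]}
sentence = "augmentation improves the model quality".split()
augmented = [augment(sentence, subs, rng=random.Random(i)) for i in range(5)]
```

Each generated variant keeps the sentence length and differs from the original only at positions covered by the dictionary, so the augmented set stays label-consistent with the source example.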
