Труды КНЦ (Технические науки) 2/2022(13).

10. Wei Z. et al. ANovel Cascade Binary Tagging Framework for Relational Triple Extraction // Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, 2020. P. 1476-1488. 11. Wu S., He Y. Enriching Pre-trained Language Model with Entity Information for Relation Classification // arXiv: 1905.08284 [cs]. 2019. 12. Baldini Soares L. et al. Matching the Blanks: Distributional Similarity for Relation Learning // Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics, 2019. С. 2895-2905. 13. Han X., Wang L. A Novel Document-Level Relation Extraction Method Based on BERT and Entity Information // IEEE Access. 2020. Vol. 8. P. 96912-96919. 14. Sahu S. K. et al. Inter-sentence Relation Extraction with Document-level Graph Convolutional Neural Network // Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics, 2019. P. 4309-4316. 15. Fu T.-J., Li P.-H., MaW.-Y. GraphRel: Modeling Text as Relational Graphs for Joint Entity and Relation Extraction // Proceedings of the 57th Annual Meeting o f the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics, 2019. P. 1409-1418. 16. Li X. et al. Entity-Relation Extraction as Multi-Turn Question Answering // Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics, 2019. P. 1340-1350. 17. Ding N. et al. Prototypical Representation Learning for Relation Extraction. 2021. P. 16. 18. Russian spaCy Models Documentation [Электронный ресурс]. URL: https://spacy.io/models/ ru#ru_core_news_sm (дата обращения: 15.06.2021). 19. Gensim: Word2Vec Model [Электронный ресурс]. URL: https://radimrehurek.com/gensim/ auto_examples/tutorials/run_word2vec.html#word2vec-model (дата обращения: 15.02.2022). References 1. Gruber T. R. A translation approach to portable ontology specifications. Knowledge Acquisition, 1993, vol. 5, no. 2, pp. 199-220. 2. Lomov P., Malozemova M., Shishaev M. Training and application of neural-network language model for ontology population. Software engineering perspectives in intelligent systems. In: Silhavy R., Silhavy P., Prokopova Z. (eds). Software Engineering Perspectives in Intelligent Systems. CoMeSySo 2020 Advances in Intelligent Systems and Computing, 2020, vol. 1295, Springer, Cham, pp. 919-926. 3. Lomov P., Malozemova M., Shishaev M. Data Augmentation in Training Neural-Network Language Model for Ontology Population. In: Silhavy R., Silhavy P., Prokopova Z. (eds). Data Science and Intelligent Systems. CoMeSySo 2021. Lecture Notes in Networks and Systems, 2021, vol. 231, Springer, Cham, pp. 669-679. 4. Hearst M. A. Automated Discovery of WordNet Relations. WordNet: An Electronic Lexical Database. MIT Press. Cambridge, 1998, 26 p. 5. Garcia M., Gamallo P. AWeakly-Supervised Rule-Based Approach for Relation Extraction, 2011, 10 p. 6. Mintz M., Bills S., Snow R., Jurafsky D. Distant supervision for relation extraction without labeled data. Proceedings o f the Joint Conference o f the 47th Annual Meeting o f the ACL and the 4th International Joint Conference on Natural Language Processing o f the AFNLP. Suntec, Singapore: Association for Computational Linguistics, 2009, pp. 1003-1011. 7. Ren X., Wu Z., He W. Qu M. CoType: Joint Extraction of Typed Entities and Relations with Knowledge Bases. Proceedings o f the 26th International Conference on World Wide Web. Perth Australia: International World Wide Web Conferences Steering Committee, 2017, pp. 1015-1024. 8. Implementation of the Brown word clustering algorithm. Available at: https://github.com/percyliang/brown-cluster (accessed 23.12.2021). 9. Devlin J., Chang M.-W., Lee K., Toutanova K. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805 [cs], 2018. Труды Кольского научного центра РАН. Серия: Технические науки. 2022. Т. 13, № 2. С. 23-30. Transactions of the Kola Science Centre of RAS. Series: Engineering Sciences. 2022. Vol. 13, No. 2. P. 23-30. © Ломов П. А., Никонорова М. Л., Шишаев М. Г., 2022 29

RkJQdWJsaXNoZXIy MTUzNzYz